SineReLU: An Alternative to the ReLU Activation Function


In an attempt to reduce the impact of the Dying ReLU problem on neural networks and improve accuracy, the Sine Rectified Linear Unit (SineReLU) does not flatten negative pre-activations to zero. Instead, it combines them with a sine-based term scaled by an extra hyper-parameter, which keeps the function differentiable. This approach also improves on the Leaky ReLU thanks to the oscillations it introduces on the negative side: SineReLU uses the difference between the sine and cosine of the input to create a wave, deactivating some neurons during the forward pass, but not all of them.
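As a rough sketch of the idea, the Python snippet below implements a SineReLU-style forward pass in plain NumPy, assuming the form contributed to Keras-Contrib: positive pre-activations pass through unchanged, while negative ones are mapped to epsilon * (sin(z) - cos(z)), where epsilon is the extra hyper-parameter mentioned above (the default value shown here is illustrative, not authoritative).

import numpy as np

def sine_relu(z, epsilon=0.0025):
    # Sketch of the SineReLU forward pass (epsilon default is illustrative).
    # Positive pre-activations pass through unchanged; negative ones are
    # replaced by a small differentiable wave, epsilon * (sin(z) - cos(z)),
    # so neurons are damped rather than permanently switched off.
    wave = epsilon * (np.sin(z) - np.cos(z))
    return np.where(z > 0.0, z, wave)

# Example: compare against plain ReLU on a few pre-activation values.
z = np.array([-3.0, -0.5, 0.5, 3.0])
print(sine_relu(z))        # negative entries become small non-zero values
print(np.maximum(z, 0.0))  # plain ReLU zeroes them out

Because the negative branch has an oscillating gradient of epsilon * (cos(z) + sin(z)) rather than a constant zero, units pushed into the negative region can still receive weight updates, which is the contrast with plain ReLU drawn in the description above.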

Language: English

Level: Intermediate

Wilder Rodrigues

Founder - ekholabs

Put together 25 years of experience in Software Engineering plus passion and practice in the Artificial Intelligence field, and you get Wilder. He is strongly involved with the AI community in Amsterdam, Utrecht and Enschede; was among the 30 finalists of the IBM Watson AI XPRIZE competition with the t2h2o.com team; is the father of the SineReLU activation function, which is part of Keras-Contrib; works as an Artificial Intelligence Engineer at Transport Learning LLC and aigent.com; and, last but not least, is the founder of ekholabs.
