SineReLU: An Alternative to the ReLU Activation Function


In an attempt to reduce the impact of the Dying ReLU problem on neural networks and improve accuracy, the Sine Rectified Linear Unit (SineReLU) does not flatten negative inputs to zero; instead, it passes them through a sinusoidal term scaled by an extra hyper-parameter, thus preserving differentiability and a non-zero gradient on the negative side. This approach is also superior to the Leaky ReLU implementation thanks to the oscillations it adds to negative inputs: it uses the difference between the sine and cosine functions to create a wave, deactivating some neurons during the forward pass, but not all of them.
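
As a rough illustration of the idea, below is a minimal NumPy sketch of a SineReLU-style activation. It assumes the form max(epsilon * (sin(x) - cos(x)), x), with epsilon as the extra hyper-parameter that scales the wave on the negative side; the exact formulation and default value of epsilon used in the talk may differ.

import numpy as np

def sine_relu(x, epsilon=0.0025):
    # Sketch of a SineReLU-style activation (assumed form, not the
    # definitive implementation): positive inputs pass through as in ReLU,
    # while negative inputs are mapped to a small oscillating wave,
    # epsilon * (sin(x) - cos(x)), instead of being clamped to zero.
    return np.maximum(epsilon * (np.sin(x) - np.cos(x)), x)

# Negative inputs yield small, non-zero, oscillating outputs instead of 0.
x = np.array([-3.0, -1.0, 0.0, 2.0])
print(sine_relu(x))

Under this assumed form, the gradient for negative inputs is epsilon * (cos(x) + sin(x)), which is small but only rarely exactly zero, which is how the wave keeps neurons from dying the way they can under plain ReLU.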

Language: English

Level: Intermediate

Wilder Rodrigues

Senior Data Scientist - VodafoneZiggo

With 20+ years of experience in Software Engineering and a strong passion for, and know-how in, Artificial Intelligence, Wilder Rodrigues has contributed extensively to the AI community in The Netherlands. He is currently an Ambassador of the City.AI Global Community, representing the Amsterdam.AI chapter. In addition, he is one of the 30 finalists of the IBM Watson AI XPRIZE competition, along with the t2h2o.com team; the father of the SineReLU activation function, which is part of the Keras framework; and a Senior Data Scientist at VodafoneZiggo.
