ReLUs: An Alternative to the ReLU Activation Function

To reduce the impact of the Dying ReLU problem on neural networks and improve accuracy, the Rectified Linear Unit with Sigmoid (ReLUs) does not flatten negative inputs to zero. Instead, it combines them with a sigmoid term and an extra hyper-parameter, which keeps the function differentiable on the negative side. This also gives it an edge over the Leaky ReLU because of the oscillations it introduces for negative inputs: the difference between the sigmoid of the sine and the sigmoid of the cosine of the input forms a wave, so some neurons are deactivated during the forward pass, but not all of them.
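To make that description concrete, below is a minimal NumPy sketch of one possible reading of it. The abstract does not give the exact formula, so the function name relus, the hyper-parameter alpha, and the negative-branch expression (scaling negative inputs by the sigmoid-of-sine minus sigmoid-of-cosine wave) are assumptions for illustration, not the speaker's published definition.

import numpy as np

def sigmoid(x):
    # Standard logistic sigmoid.
    return 1.0 / (1.0 + np.exp(-x))

def relus(x, alpha=0.1):
    # Hypothetical 'ReLUs' activation, as interpreted from the abstract.
    # Positive inputs pass through unchanged, as in a standard ReLU.
    # Negative inputs are not zeroed out; they are scaled by a wave built from
    # the difference between the sigmoid of the sine and the sigmoid of the
    # cosine of the input, weighted by the extra hyper-parameter alpha.
    wave = sigmoid(np.sin(x)) - sigmoid(np.cos(x))
    return np.where(x > 0.0, x, alpha * wave * x)

# Quick look at the shape of the function on a small grid of inputs.
x = np.linspace(-6.0, 6.0, 7)
print(relus(x))

Because the wave term oscillates with the input, some negative pre-activations are pushed close to zero while others keep a small non-zero gradient, which is the behaviour the abstract contrasts with both plain ReLU and Leaky ReLU.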

Language: English

Level: Intermediate

Wilder Rodrigues

Machine Learning Engineer - Quby

With a Computer Science background, 25 years of experience developing standalone, distributed and mobile systems, and a passion for Artificial Intelligence, Wilder has built a wealth of knowledge that he loves to share with communities. His passion for AI, together with his participation as a contestant in the IBM Watson AI XPRIZE competition, brought him to the AI for Good Summit at the United Nations in Geneva as a guest attendee. Wilder is currently an Ambassador at City.AI, a large community present in 40+ cities.
