Tag: activation function

Neither the convolutional nor the recurrent layers of a deep learning model provide non-linearity on their own. Activation functions are what enable neural networks to become non-linear. An activation … More
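To illustrate the point (a minimal NumPy sketch of my own, not taken from the post): stacking two linear layers without an activation collapses into a single linear map, while inserting a non-linearity such as tanh between them does not.

    import numpy as np

    x = np.random.randn(4, 3)                  # a batch of 4 inputs with 3 features
    W1 = np.random.randn(3, 5)
    W2 = np.random.randn(5, 2)

    linear_stack = x @ W1 @ W2                 # equivalent to one linear layer with weights W1 @ W2
    nonlinear_stack = np.tanh(x @ W1) @ W2     # tanh in between makes the mapping non-linear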
Mish As Neural Networks Activation Function
Recently, the Mish activation function was announced in the deep learning world. Researchers report that it outperforms both regular ReLU and Swish. The … More
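For reference, Mish is defined as f(x) = x · tanh(softplus(x)). A minimal NumPy sketch (the function names are mine):

    import numpy as np

    def softplus(x):
        return np.logaddexp(0.0, x)            # ln(1 + e^x), computed stably

    def mish(x):
        return x * np.tanh(softplus(x))        # Mish: x * tanh(softplus(x))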
Using Custom Activation Functions in Keras
Almost every day, a new innovation is announced in the ML field, to such an extent that the number of research papers published … More
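The general mechanism in Keras is that any callable can be passed as a layer's activation. A minimal sketch, assuming TensorFlow's Keras and a hypothetical scaled-tanh activation (the post's own example may differ):

    import tensorflow as tf

    def custom_activation(x):                  # hypothetical custom activation: a scaled tanh
        return 1.5 * tf.math.tanh(x)

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation=custom_activation),
        tf.keras.layers.Dense(1),
    ])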
Hyperbolic Secant As Neural Networks Activation Function
Hyperbolic functions are common activation functions in neural networks. Previously, we mentioned the hyperbolic tangent as an activation function. Now, we … More
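For reference, the hyperbolic secant is sech(x) = 2 / (e^x + e^-x) = 1 / cosh(x). A minimal NumPy sketch (the function name is mine):

    import numpy as np

    def sech(x):
        return 1.0 / np.cosh(x)                # sech(x) = 2 / (e^x + e^-x)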
Swish as Neural Networks Activation Function
The Google Brain team announced the Swish activation function as an alternative to ReLU in 2017. Actually, ReLU was the solution for the second AI … More
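For reference, Swish is defined as f(x) = x · sigmoid(βx), with β = 1 in its common form. A minimal NumPy sketch (the function names are mine):

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def swish(x, beta=1.0):
        return x * sigmoid(beta * x)           # Swish: x * sigmoid(beta * x)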
Leaky ReLU as a Neural Networks Activation Function
Convolutional neural networks made the ReLU activation function popular. Common alternatives such as sigmoid or tanh have upper limits to … More
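For reference, Leaky ReLU keeps a small slope α on the negative side instead of clamping it to zero, and, like ReLU, has no upper bound. A minimal NumPy sketch (the function name and default α are mine):

    import numpy as np

    def leaky_relu(x, alpha=0.01):
        return np.where(x > 0, x, alpha * x)   # small slope alpha keeps negative inputs alive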
Sinc as a Neural Networks Activation Function
The sinc function is a sinusoidal activation function in neural networks. In contrast to other common activation functions, it has rises … More
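For reference, the unnormalized sinc function is sin(x)/x, with the value defined as 1 at x = 0. A minimal NumPy sketch (the function name is mine; note that NumPy's built-in np.sinc computes the normalized variant sin(πx)/(πx) instead):

    import numpy as np

    def sinc(x):
        x = np.asarray(x, dtype=float)
        safe = np.where(x == 0, 1.0, x)        # avoid dividing by zero at the origin
        return np.where(x == 0, 1.0, np.sin(safe) / safe)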
ELU as a Neural Networks Activation Function
Recently, a new activation function named the Exponential Linear Unit, widely known as ELU, was introduced. Research reveals that … More
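For reference, ELU is defined as f(x) = x for x > 0 and α(e^x − 1) otherwise. A minimal NumPy sketch (the function name and default α are mine):

    import numpy as np

    def elu(x, alpha=1.0):
        # x for positive inputs, alpha * (e^x - 1) for negative ones; expm1 keeps it stable
        return np.where(x > 0, x, alpha * np.expm1(np.minimum(x, 0)))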
A Gentle Introduction to Cross-Entropy Loss Function
Neural networks produce multiple outputs in multi-class classification problems. However, they do not have the ability to produce exact outputs; they … More
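For reference, cross-entropy compares a one-hot target with the probabilities that a softmax produces from the raw network outputs: L = −Σᵢ yᵢ log(pᵢ). A minimal NumPy sketch (the function names and the epsilon are mine):

    import numpy as np

    def softmax(z):
        e = np.exp(z - z.max())                    # shift by the max for numerical stability
        return e / e.sum()

    def cross_entropy(y_true, z):
        p = softmax(z)
        return -np.sum(y_true * np.log(p + 1e-12)) # epsilon guards against log(0)

    # one-hot target vs. raw outputs (logits)
    print(cross_entropy(np.array([0, 1, 0]), np.array([0.3, 2.0, -1.0])))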
Logarithm of Sigmoid As a Neural Networks Activation Function
Previously, we reviewed the sigmoid function as an activation function for neural networks. The logarithm of sigmoid is a modified version of it. Unlike … More
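For reference, log sigmoid is log(1 / (1 + e^−x)) = −log(1 + e^−x). A minimal NumPy sketch (the function name is mine), using logaddexp for numerical stability:

    import numpy as np

    def log_sigmoid(x):
        # log(1 / (1 + e^-x)) = -log(1 + e^-x) = -logaddexp(0, -x)
        return -np.logaddexp(0.0, -x)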