Any nonzero number to the power of zero is 1. Meanwhile, zero to the power of any positive number is 0. Combination … More
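These two rules collide at 0⁰, which is why it is treated as indeterminate. A minimal Python sketch of the conflict (note that Python, like most languages, resolves 0**0 in favor of the x⁰ = 1 convention):

```python
print(5.0 ** 0)  # 1.0 -> any nonzero number to the power of zero is 1
print(0.0 ** 5)  # 0.0 -> zero to any positive power is 0
print(0.0 ** 0)  # 1.0 -> the two rules conflict here; Python picks x**0 = 1
```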
Tag: derivative
Mish As Neural Networks Activation Function
Recently, the Mish activation function was announced in the deep learning world. Researchers report that it outperforms both regular ReLU and Swish. The … More
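Mish is defined as f(x) = x · tanh(softplus(x)). A minimal NumPy sketch of the function and its derivative follows; the numerically stable softplus form is my own choice here, not necessarily how the full post implements it:

```python
import numpy as np

def softplus(x):
    # numerically stable softplus: ln(1 + e^x)
    return np.maximum(x, 0) + np.log1p(np.exp(-np.abs(x)))

def mish(x):
    # Mish(x) = x * tanh(softplus(x))
    return x * np.tanh(softplus(x))

def mish_derivative(x):
    # chain rule: d/dx [x * tanh(sp(x))]
    #   = tanh(sp(x)) + x * (1 - tanh(sp(x))^2) * sigmoid(x)
    sp = softplus(x)
    sigmoid = 1.0 / (1.0 + np.exp(-x))
    return np.tanh(sp) + x * (1.0 - np.tanh(sp) ** 2) * sigmoid
```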
Logarithm of Sigmoid As a Neural Networks Activation Function
Previously, we reviewed the sigmoid function as an activation function for neural networks. The logarithm of sigmoid is a modified version of it. Unlike … More
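The function is f(x) = ln(σ(x)), whose derivative simplifies neatly to 1 − σ(x) via the chain rule. A minimal sketch, written in a numerically stable form (using ln σ(x) = −softplus(−x)):

```python
import numpy as np

def log_sigmoid(x):
    # ln(sigmoid(x)) = -softplus(-x), stable for large |x|
    return -(np.maximum(-x, 0) + np.log1p(np.exp(-np.abs(x))))

def log_sigmoid_derivative(x):
    # d/dx ln(sigmoid(x)) = sigmoid'(x) / sigmoid(x) = 1 - sigmoid(x)
    return 1.0 - 1.0 / (1.0 + np.exp(-x))
```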
Softsign as a Neural Networks Activation Function
Activation functions play a pivotal role in neural networks. As an alternative to the hyperbolic tangent, softsign is an activation function for … More
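Softsign is f(x) = x / (1 + |x|); like tanh it squashes inputs into (−1, 1), but its tails approach the asymptotes polynomially rather than exponentially. A minimal sketch of the function and its derivative:

```python
import numpy as np

def softsign(x):
    # softsign(x) = x / (1 + |x|), range (-1, 1)
    return x / (1.0 + np.abs(x))

def softsign_derivative(x):
    # quotient rule gives d/dx softsign(x) = 1 / (1 + |x|)^2
    return 1.0 / (1.0 + np.abs(x)) ** 2
```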
Softmax as a Neural Networks Activation Function
In fact, convolutional neural networks have greatly popularized softmax as an activation function. However, softmax is not a traditional activation function. … More
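Unlike the element-wise functions above, softmax maps a whole vector to a probability distribution, softmaxᵢ(x) = e^{xᵢ} / Σⱼ e^{xⱼ}, so its derivative is a Jacobian matrix, ∂sᵢ/∂xⱼ = sᵢ(δᵢⱼ − sⱼ), rather than a single value. A minimal sketch:

```python
import numpy as np

def softmax(x):
    # shift by the max for numerical stability; the result is unchanged
    e = np.exp(x - np.max(x))
    return e / e.sum()

def softmax_jacobian(x):
    # d softmax_i / d x_j = s_i * (delta_ij - s_j)
    s = softmax(x)
    return np.diag(s) - np.outer(s, s)
```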