Fermat contributed to the construction of modern math with his well-known theorems. Today, we are going to mention the little one. This is … More
Category: Math
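The "little one" is Fermat's little theorem: for a prime p and an integer a not divisible by p, a^(p-1) ≡ 1 (mod p). A minimal Python check of the statement (an illustration of mine, not code from the post):

```python
# Fermat's little theorem: a^(p-1) ≡ 1 (mod p) when p is prime and p does not divide a
p = 13  # a prime modulus
for a in range(1, p):  # every a in 1..p-1 is coprime to the prime p
    assert pow(a, p - 1, p) == 1  # built-in modular exponentiation
print("a^(p-1) mod p == 1 holds for all a in 1..", p - 1)
```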
Leaky ReLU as a Neural Network Activation Function
Convolutional neural networks made the ReLU activation function popular. Common alternatives such as sigmoid or tanh have upper limits to … More
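For reference, Leaky ReLU is conventionally defined as f(x) = x for x > 0 and f(x) = αx otherwise, where α is a small slope that keeps negative inputs from dying out. A minimal NumPy sketch with the usual default α = 0.01 (the post's own code may differ):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """Leaky ReLU: identity for positive inputs, small slope alpha for negatives."""
    return np.where(x > 0, x, alpha * x)

print(leaky_relu(np.array([-2.0, 0.0, 3.0])))  # [-0.02  0.    3.  ]
```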
Moving Numbers To Upside Down: Extended Euclidean Algorithm
You might be familiar with the Upside Down if you have watched the Netflix series Stranger Things. Eleven shows the underside of … More
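The algorithm behind the metaphor: the extended Euclidean algorithm returns, along with gcd(a, b), coefficients x and y satisfying ax + by = gcd(a, b), which is what makes modular multiplicative inverses computable. A minimal recursive sketch (the function name is mine, not necessarily the post's):

```python
def extended_gcd(a, b):
    """Return (g, x, y) such that a*x + b*y == g == gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, x, y = extended_gcd(b, a % b)
    # back-substitute the coefficients found for gcd(b, a % b)
    return g, y, x - (a // b) * y

g, x, y = extended_gcd(240, 46)
print(g, x, y)  # 2 -9 47, since 240*(-9) + 46*47 == 2
```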
Sinc as a Neural Network Activation Function
The sinc function is a sinusoidal activation function for neural networks. In contrast to other common activation functions, it has rises … More
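The sinc activation is usually taken as f(x) = sin(x)/x with the removable singularity at zero set to f(0) = 1; the oscillation of sine is what produces the rises and falls the teaser alludes to. A minimal NumPy sketch (unnormalized sinc; NumPy's own np.sinc is the normalized sin(πx)/(πx) variant):

```python
import numpy as np

def sinc(x):
    """Unnormalized sinc: sin(x)/x, with f(0) defined as 1."""
    x = np.asarray(x, dtype=float)
    safe_x = np.where(x == 0, 1.0, x)  # avoid a divide-by-zero warning
    return np.where(x == 0, 1.0, np.sin(safe_x) / safe_x)

print(sinc([0.0, np.pi / 2, np.pi]))  # [1.0, ~0.6366, ~0.0]
```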
ELU as a Neural Network Activation Function
Recently, a new activation function named the Exponential Linear Unit, more widely known as ELU, was introduced. Research reveals that … More
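ELU's standard form is f(x) = x for x > 0 and f(x) = α(e^x - 1) otherwise, so negative inputs saturate smoothly toward -α instead of being zeroed as in ReLU. A minimal NumPy sketch with the common default α = 1 (a generic illustration, not the post's code):

```python
import numpy as np

def elu(x, alpha=1.0):
    """ELU: identity for positive inputs, alpha*(exp(x) - 1) for negatives."""
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

print(elu(np.array([-3.0, 0.0, 2.0])))  # [-0.9502  0.      2.    ]
```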
A Gentle Introduction to Cross-Entropy Loss Function
Neural networks produce multiple outputs in multi-class classification problems. However, they do not have the ability to produce exact outputs; they … More
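Cross-entropy compares the predicted class probabilities (typically produced by softmax) against a one-hot target: L = -Σᵢ yᵢ log(ŷᵢ). A minimal sketch of that formula (helper names are mine):

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax: shift by the max before exponentiating."""
    e = np.exp(z - np.max(z))
    return e / e.sum()

def cross_entropy(y_true, y_prob, eps=1e-12):
    """L = -sum(y_i * log(p_i)); eps guards against log(0)."""
    return -np.sum(y_true * np.log(y_prob + eps))

probs = softmax(np.array([2.0, 1.0, 0.1]))
print(cross_entropy(np.array([1.0, 0.0, 0.0]), probs))  # ~0.417
```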
Logarithm of Sigmoid as a Neural Network Activation Function
Previously, we reviewed the sigmoid function as an activation function for neural networks. The logarithm of sigmoid is a modified version of it. Unlike … More
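Log-sigmoid is f(x) = log(1/(1 + e^-x)) = -log(1 + e^-x), which maps the sigmoid's (0, 1) range down to (-∞, 0). A minimal NumPy sketch using log1p (a generic illustration, not the post's code):

```python
import numpy as np

def log_sigmoid(x):
    """log(sigmoid(x)) computed as -log(1 + exp(-x))."""
    x = np.asarray(x, dtype=float)
    return -np.log1p(np.exp(-x))

print(log_sigmoid(np.array([-1.0, 0.0, 1.0])))  # [-1.3133 -0.6931 -0.3133]
```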
Softsign as a Neural Network Activation Function
Activation functions play a pivotal role in neural networks. As an alternative to the hyperbolic tangent, softsign is an activation function for … More
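Softsign is f(x) = x / (1 + |x|): like tanh it squashes inputs into (-1, 1), but it approaches its asymptotes polynomially rather than exponentially. A minimal sketch (the function name is mine):

```python
import numpy as np

def softsign(x):
    """Softsign: x / (1 + |x|), a gentler-saturating alternative to tanh."""
    x = np.asarray(x, dtype=float)
    return x / (1.0 + np.abs(x))

print(softsign(np.array([-4.0, 0.0, 4.0])))  # [-0.8  0.   0.8]
```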
Developers vs Mathematicians
Fermat’s Last Theorem waited almost 350 years to be proven. Now we can call it a theorem, but until the 90s … More
ReLU as a Neural Network Activation Function
The rectified linear unit, more widely known as ReLU, has become popular over the past several years since its … More
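ReLU itself is simply f(x) = max(0, x), and its gradient is a cheap 0/1 indicator, which is part of why it displaced sigmoid and tanh. A minimal sketch (the subgradient at x = 0 is conventionally taken as 0 here):

```python
import numpy as np

def relu(x):
    """ReLU: zero for negative inputs, identity for positive ones."""
    return np.maximum(0.0, x)

def relu_grad(x):
    """Subgradient of ReLU: 0 for x <= 0, 1 for x > 0."""
    return (np.asarray(x) > 0).astype(float)

x = np.array([-2.0, 0.0, 3.0])
print(relu(x), relu_grad(x))  # [0. 0. 3.] [0. 0. 1.]
```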