Recently, the Mish activation function was announced in the deep learning world. Researchers report that it outperforms both regular ReLU and Swish. The … More
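As a quick taste before the full post, here is a minimal NumPy sketch of Mish as it is commonly defined, mish(x) = x · tanh(softplus(x)); the helper names below are illustrative, not taken from the post.

```python
import numpy as np

def softplus(x):
    # softplus(x) = ln(1 + e^x); log1p keeps it numerically stable for moderate x
    return np.log1p(np.exp(x))

def mish(x):
    # Mish as commonly defined: x * tanh(softplus(x))
    return x * np.tanh(softplus(x))

# quick comparison against plain ReLU at a few sample points
x = np.array([-3.0, -1.0, 0.0, 1.0, 3.0])
print("mish :", mish(x))
print("relu :", np.maximum(0.0, x))
```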
Tag: backpropagation
The Insider’s Guide to Adam Optimization Algorithm for Deep Learning
Adam is the superstar optimization algorithm of deep learning. Optimization algorithms aim to find optimum weights, minimize error and … More
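For readers who want the update rule at a glance, the sketch below performs a single Adam step with the usual default hyperparameters (β1 = 0.9, β2 = 0.999, ε = 1e-8); the adam_step helper and the toy objective are assumptions for illustration, not the post's own code.

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for weights w given the gradient of the error."""
    m = beta1 * m + (1 - beta1) * grad            # first moment (running mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2       # second moment (running mean of squared gradients)
    m_hat = m / (1 - beta1 ** t)                  # bias-corrected moments
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)   # parameter update
    return w, m, v

# toy usage: minimize f(w) = w^2, whose gradient is 2w
w, m, v = np.array([5.0]), np.zeros(1), np.zeros(1)
for t in range(1, 501):
    w, m, v = adam_step(w, 2 * w, m, v, t)
print(w)  # moves toward 0
```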
Homer Simpson Guide to Backpropagation
The backpropagation algorithm is based on complex mathematical calculations. That's why it is hard to understand, and that is the … More
Step Function as a Neural Network Activation Function
Activation functions are the decision-making units of neural networks. They calculate the net output of a neural node. Herein, the Heaviside step … More
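A minimal sketch of a neuron deciding through the Heaviside step function, assuming a zero threshold; the weights, bias and helper name are hypothetical examples, not values from the post.

```python
import numpy as np

def heaviside_step(net):
    # fires 1 when the net input reaches the threshold (0 here), otherwise 0
    return np.where(net >= 0, 1, 0)

# a single neuron: net = w . x + b, then the step decision
w, b = np.array([0.6, 0.4]), -0.5
x = np.array([1, 0])
net = np.dot(w, x) + b
print(heaviside_step(net))  # 1, because 0.6 - 0.5 >= 0
```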
Backpropagation Implementation: Neural Networks Learning From Theory To Action
We’ve focused on the math behind neural networks learning and the proof of the backpropagation algorithm. Let’s face it, the mathematical background … More
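Before moving from theory to action in the full post, here is a minimal sketch of backpropagation on a single sigmoid neuron trained on the OR problem; the learning rate, epoch count and variable names are assumptions for illustration only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# one sigmoid neuron trained with gradient descent on squared error (OR problem)
np.random.seed(0)
w, b, lr = np.random.randn(2), 0.0, 0.5
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 1], dtype=float)

for epoch in range(1000):
    for x, target in zip(X, y):
        out = sigmoid(np.dot(w, x) + b)           # forward pass
        delta = (out - target) * out * (1 - out)  # dE/dnet via the chain rule
        w -= lr * delta * x                       # dE/dw = delta * input
        b -= lr * delta                           # dE/db = delta

print(np.round(sigmoid(X @ w + b)))  # approximately [0, 1, 1, 1]
```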
The Math Behind Neural Networks Learning with Backpropagation
Neural networks are one of the most powerful machine learning algorithms. However, their background might confuse brains because of complex … More
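At its core, that math is the chain rule; as a sketch, for a single weight w_ij feeding neuron j with net input net_j, output o_j, error E and learning rate α, the gradient and update read:

```latex
\frac{\partial E}{\partial w_{ij}}
  = \frac{\partial E}{\partial o_j}
    \cdot \frac{\partial o_j}{\partial net_j}
    \cdot \frac{\partial net_j}{\partial w_{ij}},
\qquad
w_{ij} \leftarrow w_{ij} - \alpha \, \frac{\partial E}{\partial w_{ij}}
```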