Recently, the Mish activation function was announced in the deep learning world. Researchers report that it outperforms both regular ReLU and Swish. The … More
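For reference, Mish is defined as f(x) = x * tanh(softplus(x)). A minimal NumPy sketch (the helper names are mine, not from the post):

```python
import numpy as np

def softplus(x):
    # softplus(x) = ln(1 + e^x), computed in a numerically stable way
    return np.log1p(np.exp(-np.abs(x))) + np.maximum(x, 0)

def mish(x):
    # Mish: x * tanh(softplus(x))
    return x * np.tanh(softplus(x))
```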
Tag: neural networks
5 Facts about Deep Learning and Neural Networks
Marketing staff are much more successful than engineers at getting things adopted, even engineering marvels. People working in … More
The Insider’s Guide to Adam Optimization Algorithm for Deep Learning
Adam is the superstar optimization algorithm of deep learning. Optimization algorithms aim to find optimal weights, minimize error, and … More
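As background, Adam keeps exponential moving averages of the gradient and its square, corrects their bias, and scales the step per parameter. A plain-NumPy sketch of one update step, using the default hyperparameters from the original paper (the function name is hypothetical):

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # First and second moment estimates (exponential moving averages)
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Bias correction for the zero-initialized moments (t starts at 1)
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Per-parameter adaptive update
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```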
Random Initialization in Neural Networks
Neural networks require several state-of-the-art techniques, such as the choice of activation function or network design, to push their … More
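One widely used initialization scheme in this area is Glorot/Xavier initialization, which scales random weights by the layer's fan-in and fan-out. A small illustrative sketch (assuming the uniform Glorot variant; the helper name is mine):

```python
import numpy as np

def xavier_init(fan_in, fan_out):
    # Glorot/Xavier uniform: keeps activation variance roughly stable across layers
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return np.random.uniform(-limit, limit, size=(fan_in, fan_out))
```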
How Vectorization Saves Life in Neural Networks
Developers tend to handle problems with conditional statements and loops. This is the number one topic for developers and data … More
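To make the contrast concrete, here is a sketch of the loop-versus-vectorization idea with NumPy (the array sizes are arbitrary):

```python
import numpy as np

x = np.random.rand(1_000_000)
w = np.random.rand(1_000_000)

# Loop-based dot product: slow in pure Python
total = 0.0
for i in range(len(x)):
    total += x[i] * w[i]

# Vectorized equivalent: a single optimized call
total_vec = np.dot(x, w)
```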
Convolutional Autoencoder: Clustering Images with Neural Networks
Previously, we applied a conventional autoencoder to the handwritten digit database (MNIST). That approach worked pretty well. We can apply the same model to … More
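A minimal Keras sketch of a convolutional autoencoder for 28x28 MNIST images (the layer sizes are my assumption, not necessarily the post's architecture):

```python
from tensorflow.keras import layers, models

# Encoder: compress 28x28x1 images into a small feature map
inputs = layers.Input(shape=(28, 28, 1))
x = layers.Conv2D(16, (3, 3), activation='relu', padding='same')(inputs)
x = layers.MaxPooling2D((2, 2), padding='same')(x)
x = layers.Conv2D(8, (3, 3), activation='relu', padding='same')(x)
encoded = layers.MaxPooling2D((2, 2), padding='same')(x)

# Decoder: reconstruct the original image from the compressed representation
x = layers.Conv2D(8, (3, 3), activation='relu', padding='same')(encoded)
x = layers.UpSampling2D((2, 2))(x)
x = layers.Conv2D(16, (3, 3), activation='relu', padding='same')(x)
x = layers.UpSampling2D((2, 2))(x)
decoded = layers.Conv2D(1, (3, 3), activation='sigmoid', padding='same')(x)

autoencoder = models.Model(inputs, decoded)
autoencoder.compile(optimizer='adam', loss='binary_crossentropy')
```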
Autoencoder: Neural Networks For Unsupervised Learning
Neural networks are like Swiss army knives. They can solve both classification and regression problems. Surprisingly, they can also contribute … More
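For illustration, the simplest dense autoencoder in Keras compresses an input and learns to reconstruct it; the 32-dimensional code below is an arbitrary choice, not taken from the post:

```python
from tensorflow.keras import layers, models

# Encoder compresses a 784-dim input to a 32-dim code; decoder reconstructs it
inputs = layers.Input(shape=(784,))
encoded = layers.Dense(32, activation='relu')(inputs)
decoded = layers.Dense(784, activation='sigmoid')(encoded)

autoencoder = models.Model(inputs, decoded)
autoencoder.compile(optimizer='adam', loss='binary_crossentropy')
```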
Handling Overfitting with Dropout in Neural Networks
Overfitting is a troublemaker for neural networks. Designing too complex a network structure can cause overfitting. So, dropout was introduced to … More
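In Keras, dropout is a single layer that randomly zeroes a fraction of activations during training; a minimal sketch (the 0.5 rate and layer sizes are illustrative):

```python
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Dense(128, activation='relu', input_shape=(784,)),
    # Randomly drops 50% of activations during training only
    layers.Dropout(0.5),
    layers.Dense(10, activation='softmax'),
])
```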
Facial Expression Recognition with Keras
Kaggle announced a facial expression recognition challenge in 2013. Researchers were expected to create models to detect 7 different emotions from human … More
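The FER2013 dataset from that challenge contains 48x48 grayscale face images labeled with 7 emotions. A small illustrative Keras CNN for this input shape (this architecture is a sketch, not the post's model):

```python
from tensorflow.keras import layers, models

# FER2013: 48x48 grayscale inputs, 7 emotion classes
model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation='relu', input_shape=(48, 48, 1)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation='relu'),
    layers.Dense(7, activation='softmax'),
])
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
```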
Logarithm of Sigmoid As a Neural Networks Activation Function
Previously, we reviewed the sigmoid function as an activation function for neural networks. The logarithm of sigmoid is a modified version of it. Unlike … More
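Numerically, log-sigmoid is log(1 / (1 + e^(-x))) = -log(1 + e^(-x)). A stable NumPy sketch (the function name is mine):

```python
import numpy as np

def log_sigmoid(x):
    # log(sigmoid(x)) = -log(1 + e^(-x)), computed in a numerically stable way
    return np.minimum(x, 0) - np.log1p(np.exp(-np.abs(x)))
```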