Convolutional neural networks made the ReLU activation function popular. Common alternatives such as sigmoid or tanh have upper limits to … More
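The contrast the teaser draws can be sketched numerically: sigmoid and tanh saturate toward fixed upper bounds, while ReLU grows without limit. A minimal sketch with NumPy (function names here are illustrative, not from the post):

```python
import numpy as np

def relu(x):
    # unbounded above: relu(x) = max(0, x)
    return np.maximum(0.0, x)

def sigmoid(x):
    # bounded in (0, 1): saturates for large |x|
    return 1.0 / (1.0 + np.exp(-x))

for x in [1.0, 10.0, 100.0]:
    print(x, relu(x), sigmoid(x))  # sigmoid flattens near 1, relu keeps growing
```

For large inputs sigmoid's output barely changes, which is the saturation behavior the excerpt alludes to.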
Author: Sefik Serengil
Elegant Signatures with Elliptic Curve Cryptography
Elegance is the only beauty that never fades. This is a beloved Audrey Hepburn quote. I struggle to adapt this … More
Moving Numbers To Upside Down: Extended Euclidean Algorithm
You might be familiar with the Upside Down if you watched the Netflix series Stranger Things. Eleven shows the underside of … More
Real Time Facial Expression Recognition on Streaming Data
Previously, we’ve worked on facial expression recognition for a custom image. Additionally, we can detect multiple faces in an image, … More
Sinc as a Neural Networks Activation Function
The sinc function is a sinusoidal activation function for neural networks. In contrast to other common activation functions, it has rises … More
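The unnormalized sinc function mentioned in this teaser is sin(x)/x with the value 1 defined at x = 0. A small, zero-safe sketch (assuming the unnormalized form; the post's own definition may differ):

```python
import numpy as np

def sinc(x):
    # sin(x)/x, with sinc(0) defined as 1 to avoid division by zero
    x = np.asarray(x, dtype=float)
    safe_x = np.where(x == 0.0, 1.0, x)
    return np.where(x == 0.0, 1.0, np.sin(x) / safe_x)
```

Note that NumPy's built-in `np.sinc` computes the normalized variant sin(πx)/(πx), so the two differ by a scaling of the input.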
ELU as a Neural Networks Activation Function
Recently, a new activation function named Exponential Linear Unit, widely known as ELU, was introduced. Research reveals that … More
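The ELU the teaser introduces is piecewise: identity for positive inputs, and α(eˣ − 1) for negative ones, so it saturates smoothly toward −α instead of cutting off at zero like ReLU. A minimal sketch:

```python
import numpy as np

def elu(x, alpha=1.0):
    # x            for x > 0
    # alpha*(e^x - 1)  for x <= 0, saturating toward -alpha
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))
```

Unlike ReLU, negative inputs still carry a nonzero gradient, which is one of the properties the paper highlighted.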
Facial Expression Recognition with Keras
Kaggle announced a facial expression recognition challenge in 2013. Researchers were expected to create models to detect 7 different emotions from human … More
A Gentle Introduction to Cross-Entropy Loss Function
Neural networks produce multiple outputs in multi-class classification problems. However, they do not have the ability to produce exact outputs; they … More
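The teaser's point is that raw network outputs are scores, not exact class labels; softmax turns them into probabilities and cross-entropy measures how far those probabilities are from the one-hot target. A minimal sketch (helper names are illustrative):

```python
import numpy as np

def softmax(z):
    # shift by max for numerical stability, then normalize to probabilities
    e = np.exp(z - np.max(z))
    return e / e.sum()

def cross_entropy(y_true, y_pred):
    # y_true: one-hot target, y_pred: predicted probabilities
    return -np.sum(y_true * np.log(y_pred))

scores = np.array([10.0, 0.0, 0.0])   # raw network outputs (logits)
probs = softmax(scores)
loss = cross_entropy(np.array([1.0, 0.0, 0.0]), probs)
```

A confident, correct prediction gives a loss near zero; the loss grows as probability mass shifts to wrong classes.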
Transfer Learning in Keras Using Inception V3
Machine learning researchers would like to share their outcomes. They might spend a lot of time constructing a neural network … More
Logarithm of Sigmoid As a Neural Networks Activation Function
Previously, we’ve reviewed the sigmoid function as an activation function for neural networks. The logarithm of sigmoid is a modified version of it. Unlike … More
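The log of sigmoid simplifies algebraically: log(1/(1+e⁻ˣ)) = −log(1+e⁻ˣ), which can be computed stably with `np.logaddexp`. A minimal sketch of that identity:

```python
import numpy as np

def log_sigmoid(x):
    # log(sigmoid(x)) = -log(1 + e^-x) = -logaddexp(0, -x)
    # logaddexp avoids overflow for large negative x
    return -np.logaddexp(0.0, -x)
```

The output is always non-positive, approaching 0 for large positive inputs and behaving like x for large negative ones.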