Decision trees are still a hot topic in the data science world. ID3 is the most common conventional decision tree algorithm … More
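Since the teaser names ID3, here is a minimal sketch of the two quantities at its core, entropy and information gain; the function names and toy labels are my own illustration, not from the post:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(labels, split_groups):
    """Entropy reduction from splitting `labels` into `split_groups`.

    ID3 picks the attribute whose split maximizes this value.
    """
    total = len(labels)
    weighted = sum(len(g) / total * entropy(g) for g in split_groups)
    return entropy(labels) - weighted

# A perfectly separating split recovers all 1.0 bits of entropy.
print(information_gain(['yes', 'yes', 'no', 'no'],
                       [['yes', 'yes'], ['no', 'no']]))
```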
Random Initialization in Neural Networks
Neural networks require several state-of-the-art techniques, such as the choice of activation function or network design, to push their … More
How Vectorization Saves Life in Neural Networks
Developers tend to handle problems with conditional statements and loops. This is the number one topic among developers and data … More
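To make the loops-versus-vectorization contrast concrete, here is a small sketch of the same matrix-vector product written both ways in NumPy; the function names are mine, for illustration only:

```python
import numpy as np

def forward_loop(W, x):
    # Loop-based forward pass: compute each neuron's pre-activation
    # one multiply-add at a time.
    z = np.zeros(W.shape[0])
    for i in range(W.shape[0]):
        for j in range(W.shape[1]):
            z[i] += W[i, j] * x[j]
    return z

def forward_vectorized(W, x):
    # The same computation as a single matrix-vector product,
    # dispatched to optimized BLAS routines.
    return W @ x

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))
x = rng.standard_normal(3)
print(np.allclose(forward_loop(W, x), forward_vectorized(W, x)))
```

The results are identical; the vectorized form is both shorter and dramatically faster on realistically sized layers.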
Convolutional Autoencoder: Clustering Images with Neural Networks
Previously, we applied a conventional autoencoder to the handwritten digit database (MNIST). That approach worked pretty well. We can apply the same model to … More
Autoencoder: Neural Networks For Unsupervised Learning
Neural networks are like Swiss Army knives. They can solve both classification and regression problems. Surprisingly, they can also contribute … More
Handling Overfitting with Dropout in Neural Networks
Overfitting is a troublemaker for neural networks. Designing too complex a network structure can cause overfitting, so dropout was introduced to … More
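As a rough sketch of the idea the teaser points at, here is inverted dropout in NumPy: units are zeroed with probability `1 - keep_prob` during training, and survivors are rescaled so the expected activation is unchanged. The function and parameter names are my own illustration:

```python
import numpy as np

def dropout(activations, keep_prob=0.8, rng=None):
    """Inverted dropout: randomly silence units, rescale the rest."""
    rng = rng if rng is not None else np.random.default_rng(0)
    # Boolean mask: True for units that survive this training step.
    mask = rng.random(activations.shape) < keep_prob
    # Dividing by keep_prob keeps the expected sum of activations stable,
    # so no rescaling is needed at inference time.
    return activations * mask / keep_prob

a = np.ones((2, 5))
print(dropout(a, keep_prob=0.8))
```

At test time the layer is simply left untouched, which is the main convenience of the inverted variant.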
Oracle Analytics Summit 2018 Istanbul
The Oracle Analytics Summit was held in Istanbul on March 7, 2018. I have been attending this summit since 2016. I … More
Leaky ReLU as a Neural Networks Activation Function
Convolutional neural networks made the ReLU activation function popular. Common alternatives such as sigmoid or tanh have upper limits to … More
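For reference, Leaky ReLU is a one-liner: it passes positive inputs through unchanged and scales negative inputs by a small slope instead of zeroing them. A minimal sketch, with the slope `alpha=0.01` as a common default rather than a value taken from the post:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # Identity for positive inputs; a small negative slope (alpha * x)
    # elsewhere, so gradients never vanish entirely for negative inputs.
    return np.where(x > 0, x, alpha * x)

print(leaky_relu(np.array([-2.0, 0.0, 3.0])))  # → [-0.02  0.    3.  ]
```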
Real Time Facial Expression Recognition on Streaming Data
Previously, we worked on facial expression recognition for a custom image. Additionally, we can detect multiple faces in an image, … More
Sinc as a Neural Networks Activation Function
The sinc function is a sinusoidal activation function for neural networks. In contrast to other common activation functions, it has rises … More
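As a sketch of the function itself, unnormalized sinc is sin(x)/x with the removable singularity defined as 1 at x = 0; whether the post uses the normalized variant is not visible from the teaser, so I assume the unnormalized form here:

```python
import numpy as np

def sinc(x):
    """Unnormalized sinc: sin(x)/x, with sinc(0) defined as 1."""
    x = np.asarray(x, dtype=float)
    out = np.ones_like(x)      # handles the x == 0 case
    nz = x != 0
    out[nz] = np.sin(x[nz]) / x[nz]
    return out

print(sinc(np.array([0.0, np.pi / 2, np.pi])))
```

The oscillating rises and falls (zero crossings at multiples of pi) are what distinguish it from monotone activations like sigmoid or ReLU.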
