
Sefik Ilkin Serengil

Code wins arguments


Tag: neural networks

Mish As Neural Networks Activation Function

Recently, the Mish activation function was announced in the deep learning world. Researchers report that it outperforms both the regular ReLU and Swish. The … More

activation, activation function, backpropagation, derivative, keras, mish, neural networks, python, softplus, tanh
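
The post's tags already hint at the definition: Mish composes tanh with softplus. As a rough sketch of the function itself (plain NumPy, not the post's own code):

import numpy as np

def softplus(x):
    # softplus(x) = ln(1 + e^x), a smooth approximation of ReLU
    return np.log1p(np.exp(x))

def mish(x):
    # Mish(x) = x * tanh(softplus(x))
    return x * np.tanh(softplus(x))

print(mish(np.array([-2.0, 0.0, 2.0])))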

5 Facts about Deep Learning and Neural Networks

Marketing staff are much more successful than engineers at getting things adopted, even engineering marvels. People working in … More

deep learning, marketing, neural networks

The Insider’s Guide to Adam Optimization Algorithm for Deep Learning

Adam is the superstar optimization algorithm of deep learning. Optimization algorithms aim to find the optimum weights, minimize the error and … More

adam, backpropagation, gradient descent, neural networks, optimization
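
For context, a minimal sketch of the Adam update rule (the standard formulation, not code from the post): it keeps exponential moving averages of the gradient and its square, with bias correction.

import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # moving averages of the gradient and the squared gradient
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    # bias correction for the zero-initialized moments
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # scale the step by the inverse root of the second moment
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v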

Random Initialization in Neural Networks

Neural networks require several state-of-the-art techniques, such as the choice of activation function or network design, to push their … More

neural networks, python
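
As an illustrative sketch (the post's exact scheme may differ), Xavier/Glorot initialization draws weights at a scale tied to the layer's fan-in and fan-out:

import numpy as np

def xavier_init(fan_in, fan_out):
    # Glorot/Xavier uniform initialization keeps activation variance stable
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return np.random.uniform(-limit, limit, size=(fan_in, fan_out))

W = xavier_init(784, 128)  # e.g. the first hidden layer for an MNIST-sized input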

How Vectorization Saves Life in Neural Networks

Developers tend to handle problems with conditional statements and loops. This is the number one topic among developers and data … More

neural networks, numpy, python, vectorization
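
A minimal sketch of the idea (a hypothetical example, not taken from the post): replacing a Python loop with a single NumPy call.

import numpy as np

x = np.random.rand(1_000_000)
w = np.random.rand(1_000_000)

# loop version: interpreted element by element, slow
total = 0.0
for i in range(len(x)):
    total += x[i] * w[i]

# vectorized version: one optimized C-level call
total_vec = np.dot(x, w)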

Convolutional Autoencoder: Clustering Images with Neural Networks

Previously, we've applied a conventional autoencoder to the handwritten digit database (MNIST). That approach worked pretty well. We can apply the same model to … More

autoencoder, neural networks, unsupervised learning

Autoencoder: Neural Networks For Unsupervised Learning

Neural networks are like Swiss army knives. They can solve both classification and regression problems. Surprisingly, they can also contribute … More

autoencoder, neural networks, unsupervised learning

Handling Overfitting with Dropout in Neural Networks

Overfitting is a troublemaker for neural networks. Designing too complex a network structure can cause overfitting. So, dropout was introduced to … More

hinton, keras, neural networks, tensorflow
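
As a quick sketch of the technique (assuming the tf.keras API, not necessarily the post's exact model), a Dropout layer randomly zeroes a fraction of activations during training:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout

model = Sequential([
    Dense(128, activation='relu', input_shape=(784,)),
    Dropout(0.5),  # randomly drop 50% of the units, during training only
    Dense(10, activation='softmax'),
])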

Facial Expression Recognition with Keras

Kaggle announced a facial expression recognition challenge in 2013. Researchers were expected to create models to detect 7 different emotions from human … More

cnn, kaggle, keras, neural networks, tensorflow

Logarithm of Sigmoid As a Neural Networks Activation Function

Previously, we've reviewed the sigmoid function as an activation function for neural networks. The logarithm of sigmoid is a modified version of it. Unlike … More

activation function, derivative, neural networks
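
For reference (a sketch, not the post's code), log-sigmoid is just the negative softplus of -x, since log sigmoid(x) = -log(1 + e^(-x)):

import numpy as np

def log_sigmoid(x):
    # log(sigmoid(x)) = -log(1 + exp(-x)) = -softplus(-x)
    # log1p improves numerical stability for small arguments
    return -np.log1p(np.exp(-x))

print(log_sigmoid(np.array([-2.0, 0.0, 2.0])))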



