
Sefik Ilkin Serengil

Code wins arguments


Tag: neural networks

Softsign as a Neural Networks Activation Function

Activation functions play a pivotal role in neural networks. As an alternative to the hyperbolic tangent, softsign is an activation function for … More

activation function, derivative, neural networks
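The softsign function mentioned in the teaser, and the derivative that backpropagation needs, can be sketched in NumPy (a minimal illustration, not the post's own code):

```python
import numpy as np

def softsign(x):
    # softsign(x) = x / (1 + |x|); squashes to (-1, 1) like tanh,
    # but approaches its asymptotes polynomially rather than exponentially
    return x / (1 + np.abs(x))

def softsign_derivative(x):
    # d/dx softsign(x) = 1 / (1 + |x|)^2
    return 1 / (1 + np.abs(x)) ** 2
```

At the origin the function is 0 and its slope is 1, matching tanh; the gentler saturation is what makes it an alternative.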

Softmax as a Neural Networks Activation Function

In fact, convolutional neural networks have popularized softmax as an activation function. However, softmax is not a traditional activation function. … More

activation function, derivative, neural networks
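Unlike a traditional element-wise activation, softmax normalizes a whole vector of scores into probabilities. A minimal NumPy sketch (the max subtraction for numerical stability is a standard trick, assumed here rather than quoted from the post):

```python
import numpy as np

def softmax(z):
    # subtract the max before exponentiating to avoid overflow;
    # the shift cancels out and the outputs sum to 1
    e = np.exp(z - np.max(z))
    return e / e.sum()
```

For example, `softmax(np.array([1.0, 2.0, 3.0]))` returns a probability vector whose largest entry corresponds to the largest score.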

A Gentle Introduction to Convolutional Neural Networks

Convolutional neural networks (aka CNNs and ConvNets) are modified versions of traditional neural networks. These networks have wide and deep … More

cnn, convnet, convolution, deep learning, neural networks

Evolution of Neural Networks

Today, AI is living its golden age, and neural networks make a great contribution to it. Neural networks change our lives … More

history, neural networks, svm

ReLU as Neural Networks Activation Function

The rectified linear unit, more widely known as ReLU, has become popular over the past several years since its … More

activation function, neural networks
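ReLU and its derivative are short enough to sketch directly (a generic illustration, assuming the usual convention of a zero gradient at the non-differentiable point x = 0):

```python
import numpy as np

def relu(x):
    # max(0, x): passes positives through, zeroes out negatives
    return np.maximum(0, x)

def relu_derivative(x):
    # 1 for positive inputs, 0 otherwise (gradient at 0 taken as 0)
    return (x > 0).astype(float)
```

The constant unit gradient on the positive side is what helps deep networks avoid the vanishing-gradient problem that saturating functions suffer from.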

Softplus as a Neural Networks Activation Function

The activation unit calculates the net output of a neural cell in neural networks. The backpropagation algorithm multiplies the derivative of the … More

activation function, neural networks
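Softplus, a smooth approximation of ReLU, has the convenient property that its derivative is the logistic sigmoid. A minimal NumPy sketch (using `log1p` for numerical accuracy is an assumption on my part, not necessarily the post's formulation):

```python
import numpy as np

def softplus(x):
    # softplus(x) = ln(1 + e^x); log1p improves accuracy for small e^x
    return np.log1p(np.exp(x))

def softplus_derivative(x):
    # d/dx softplus(x) = sigmoid(x) = 1 / (1 + e^-x)
    return 1 / (1 + np.exp(-x))
```

Backpropagation multiplies this derivative into the chain of gradients, which is why having a simple closed form for it matters.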

Transparency in AI

Until now, authorities have described machine learning algorithms based on their learning style, supervised or unsupervised. Rob Walker brings … More

decision tree, genetic algorithms, neural networks, pegasystems, random forest

Homer Simpson Guide to Backpropagation

The backpropagation algorithm is based on complex mathematical calculations. That's why it is hard to understand, and that is the … More

backpropagation, batman, homer simpson, neural networks

Step Function as a Neural Network Activation Function

Activation functions are the decision-making units of neural networks. They calculate the net output of a neural node. Herein, the Heaviside step … More

activation function, backpropagation, heaviside, neural networks
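The Heaviside step function fires 1 when the net input reaches a threshold and 0 otherwise; a one-line sketch (the threshold-at-zero convention is assumed):

```python
def heaviside_step(x, threshold=0.0):
    # fire 1 if the net input reaches the threshold, else 0
    return 1 if x >= threshold else 0
```

Because its derivative is zero everywhere except the threshold, the step function cannot propagate gradients, which is why backpropagation-trained networks replace it with smooth activations.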

Adaptive Learning in Neural Networks

Gradient descent is one of the most powerful optimization methods. However, learning time is a challenge, too. The standard version of gradient … More

adaptive learning rate, back propagation, gradient descent, neural networks
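One common form of adaptive learning is to decay the learning rate over time instead of keeping it fixed. A minimal sketch of gradient descent with time-based decay (the `lr / (1 + decay * t)` schedule is a standard choice I am assuming here, not necessarily the one the post derives):

```python
def decayed_gradient_descent(grad, w, lr=0.1, decay=0.01, steps=100):
    # shrink the step size each iteration: lr_t = lr / (1 + decay * t),
    # so early steps move fast and later steps settle near the minimum
    for t in range(steps):
        w = w - (lr / (1 + decay * t)) * grad(w)
    return w
```

For example, minimizing f(w) = w² (gradient 2w) from w = 1.0 drives w very close to the optimum at 0.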



Licensed under a Creative Commons Attribution 4.0 International License.


You can use any content of this blog as long as you cite or reference it.
