
Sefik Ilkin Serengil

Code wins arguments


Tag: activation function

Dance Moves of Deep Learning Activation Functions

Neither convolutional nor recurrent layers enable non-linearity on their own. Activation functions are what make neural networks non-linear. An activation … More

activation function, deep learning
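As a quick illustration of why non-linearity matters: stacking linear layers without an activation collapses into a single linear map, while inserting even a plain ReLU breaks that equivalence. A minimal NumPy sketch (the matrices and input are arbitrary examples):

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))  # first linear layer
W2 = rng.standard_normal((2, 4))  # second linear layer
x = rng.standard_normal(3)

# Two stacked linear layers collapse into the single linear map W2 @ W1.
linear_stack = W2 @ (W1 @ x)
collapsed = (W2 @ W1) @ x
print(np.allclose(linear_stack, collapsed))  # True: still linear

# With a ReLU in between, the composition is no longer linear in x.
relu = lambda z: np.maximum(z, 0.0)
nonlinear = W2 @ relu(W1 @ x)
```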

Mish As Neural Networks Activation Function

Recently, the Mish activation function was announced in the deep learning world. Researchers report that it outperforms both regular ReLU and Swish. The … More

activation, activation function, backpropagation, derivative, keras, mish, neural networks, python, softplus, tanh
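For reference, Mish is defined as x · tanh(softplus(x)), which is why the softplus and tanh tags appear above. A minimal NumPy sketch:

```python
import numpy as np

def softplus(x):
    # softplus(x) = ln(1 + e^x), the smooth approximation of ReLU
    return np.log1p(np.exp(x))

def mish(x):
    # Mish: x * tanh(softplus(x))
    return x * np.tanh(softplus(x))

print(mish(0.0))  # 0.0 -- Mish passes through the origin
```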

Using Custom Activation Functions in Keras

Almost every day a new innovation is announced in the ML field, to such an extent that the number of research papers published … More

activation function, e-swish, keras, swish
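As a sketch of the idea, a custom activation is just a callable, and Keras accepts such a function directly as a layer's `activation` argument. Below is E-swish (β · x · sigmoid(x), where β is a hyperparameter; the value 1.25 here is an arbitrary example) in plain NumPy for clarity:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def e_swish(x, beta=1.25):
    # E-swish: beta * x * sigmoid(x); beta = 1 recovers plain Swish
    return beta * x * sigmoid(x)

# In Keras, an equivalent callable written with backend ops could be passed
# as, e.g., Dense(64, activation=my_activation) -- see the post for details.
```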

Hyperbolic Secant As Neural Networks Activation Function

Hyperbolic functions are common activation functions in neural networks. Previously, we mentioned the hyperbolic tangent as an activation function. Now, we … More

activation function
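The hyperbolic secant is sech(x) = 1/cosh(x) = 2/(eˣ + e⁻ˣ), and its derivative, needed for backpropagation, is −sech(x)·tanh(x). A minimal NumPy sketch:

```python
import numpy as np

def sech(x):
    # hyperbolic secant: 1 / cosh(x) = 2 / (e^x + e^-x)
    return 1.0 / np.cosh(x)

def sech_derivative(x):
    # d/dx sech(x) = -sech(x) * tanh(x)
    return -sech(x) * np.tanh(x)
```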

Swish as Neural Networks Activation Function

The Google Brain team announced the Swish activation function as an alternative to ReLU in 2017. Actually, ReLU was the solution for the second AI … More

activation function
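Swish is simply x · sigmoid(x): near-linear for large positive inputs, but smooth and non-monotonic around zero, unlike ReLU. A minimal NumPy sketch:

```python
import numpy as np

def swish(x):
    # Swish: x * sigmoid(x) = x / (1 + e^-x)
    return x / (1.0 + np.exp(-x))
```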

Leaky ReLU as a Neural Networks Activation Function

Convolutional neural networks made the ReLU activation function popular. Common alternatives such as sigmoid or tanh have upper limits to … More

activation function, LReLU, PReLU
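Leaky ReLU keeps a small slope α for negative inputs instead of zeroing them out; in PReLU, α is a learned parameter rather than a fixed constant. A minimal NumPy sketch with the common default α = 0.01:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: x for positive inputs, alpha * x for negative ones
    return np.where(x > 0, x, alpha * x)
```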

Sinc as a Neural Networks Activation Function

The sinc function is a sinusoidal activation function in neural networks. In contrast to other common activation functions, it has rises … More

activation function, sine wave
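The unnormalized sinc is sin(x)/x, with the removable singularity at zero defined as sinc(0) = 1. A minimal NumPy sketch (note that NumPy's built-in `np.sinc` is the normalized variant sin(πx)/(πx)):

```python
import numpy as np

def sinc(x):
    # unnormalized sinc: sin(x) / x, with sinc(0) defined as 1
    x = np.atleast_1d(np.asarray(x, dtype=float))
    out = np.ones_like(x)
    nz = x != 0
    out[nz] = np.sin(x[nz]) / x[nz]
    return out
```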

ELU as a Neural Networks Activation Function

Recently, a new activation function named Exponential Linear Unit, widely known as ELU, was introduced. Research reveals that … More

activation function
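ELU behaves like the identity for positive inputs and saturates smoothly to −α for negative ones, instead of cutting off at zero like ReLU. A minimal NumPy sketch:

```python
import numpy as np

def elu(x, alpha=1.0):
    # ELU: x for x > 0, alpha * (e^x - 1) otherwise.
    # np.minimum guards the exp against overflow, since np.where
    # evaluates both branches.
    return np.where(x > 0, x, alpha * (np.exp(np.minimum(x, 0.0)) - 1.0))
```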

A Gentle Introduction to Cross-Entropy Loss Function

Neural networks produce multiple outputs in multi-class classification problems. However, they do not have the ability to produce exact outputs; they … More

activation function, error function, loss function
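In the usual setup, softmax turns the network's raw outputs into a probability distribution, and cross-entropy loss is −Σ yᵢ · log(pᵢ) against the true labels. A minimal NumPy sketch:

```python
import numpy as np

def softmax(z):
    # subtract the max for numerical stability
    e = np.exp(z - np.max(z))
    return e / e.sum()

def cross_entropy(y_true, probs, eps=1e-12):
    # cross-entropy loss: -sum(y * log(p)); eps guards against log(0)
    return -np.sum(y_true * np.log(probs + eps))

probs = softmax(np.array([2.0, 1.0, 0.1]))
loss = cross_entropy(np.array([1.0, 0.0, 0.0]), probs)
```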

Logarithm of Sigmoid As a Neural Networks Activation Function

Previously, we’ve reviewed the sigmoid function as an activation function for neural networks. The logarithm of sigmoid is a modified version of it. Unlike … More

activation function, derivative, neural networks
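Log of sigmoid is log(1/(1 + e⁻ˣ)) = −log(1 + e⁻ˣ), i.e. the negative softplus of −x; computing it via `logaddexp` avoids overflow for large negative inputs. A minimal NumPy sketch:

```python
import numpy as np

def log_sigmoid(x):
    # log(sigmoid(x)) = -log(1 + e^-x), computed stably via logaddexp
    return -np.logaddexp(0.0, -x)
```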



Licensed under a Creative Commons Attribution 4.0 International License.


You may use any content of this blog provided that you cite or reference it.
