
Sefik Ilkin Serengil

Code wins arguments

Category: Math

Unofficial Guide to Fermat’s Little Theorem

Fermat contributed to the construction of modern math with well-known theorems. Today, we are going to mention the little one. This is … More

fermat, rsa
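
To give a taste of the theorem before clicking through: for a prime p and an integer a not divisible by p, it says a^(p-1) ≡ 1 (mod p). Here is a minimal Python sketch (not the post's own code; p = 97 and a = 10 are arbitrary picks) that also shows the modular-inverse consequence that RSA-style math leans on:

    # Fermat's little theorem: a^(p-1) ≡ 1 (mod p) for prime p with gcd(a, p) = 1
    p, a = 97, 10
    assert pow(a, p - 1, p) == 1

    # Consequence: a^(p-2) is the multiplicative inverse of a modulo p
    inverse = pow(a, p - 2, p)
    assert (a * inverse) % p == 1
    print(inverse)  # 68, since 10 * 68 = 680 = 7 * 97 + 1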

Leaky ReLU as a Neural Networks Activation Function

Convolutional neural networks made the ReLU activation function popular. Common alternatives such as sigmoid or tanh have upper limits to … More

activation function, LReLU, PReLU
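
For a quick look at the idea, a minimal NumPy sketch of Leaky ReLU (not the post's own code; the slope alpha = 0.01 is a common default, and PReLU simply turns that slope into a learned parameter):

    import numpy as np

    def leaky_relu(x, alpha=0.01):
        # positives pass through unchanged; negatives are scaled by alpha
        # instead of being zeroed, so gradients on that side never die completely
        return np.where(x > 0, x, alpha * x)

    print(leaky_relu(np.array([-3.0, -0.5, 0.0, 2.0])))  # [-0.03, -0.005, 0.0, 2.0]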

Moving Numbers To Upside Down: Extended Euclidean Algorithm

You might be familiar with the upside down if you watched the Netflix series Stranger Things. Eleven shows the underside of … More

big o notation, euclidean, Java, modular inverse, multiplicative inverse, python, rsa, stranger things
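
The punchline of the post, in code: the extended Euclidean algorithm returns not just gcd(a, b) but also Bézout coefficients x, y with ax + by = gcd(a, b), which is exactly what a modular multiplicative inverse needs. A minimal Python sketch, not the post's own listing:

    def extended_gcd(a, b):
        # returns (g, x, y) satisfying a*x + b*y = g = gcd(a, b)
        if b == 0:
            return a, 1, 0
        g, x, y = extended_gcd(b, a % b)
        return g, y, x - (a // b) * y

    def mod_inverse(a, m):
        # the inverse exists only when a and m are coprime
        g, x, _ = extended_gcd(a, m)
        if g != 1:
            raise ValueError(f"{a} has no inverse modulo {m}")
        return x % m

    print(mod_inverse(7, 40))  # 23, because 7 * 23 = 161 ≡ 1 (mod 40)

RSA key generation uses this same step: the private exponent d is the inverse of the public exponent e modulo φ(n).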

Sinc as a Neural Networks Activation Function

The sinc function is a sinusoidal activation function for neural networks. In contrast to other common activation functions, it has rises … More

activation function, sine wave
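
For reference, sinc here is the unnormalized sin(x)/x with its removable singularity patched so that sinc(0) = 1 (note that NumPy's built-in np.sinc is the normalized variant sin(πx)/(πx)). A minimal sketch, not the post's own code:

    import numpy as np

    def sinc(x):
        # sin(x)/x, with sinc(0) defined as its limit, 1
        x = np.asarray(x, dtype=float)
        out = np.ones_like(x)
        nonzero = x != 0
        out[nonzero] = np.sin(x[nonzero]) / x[nonzero]
        return out

    print(sinc(np.array([-np.pi, 0.0, np.pi / 2])))  # [~0.0, 1.0, ~0.6366]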

ELU as a Neural Networks Activation Function

Recently, a new activation function named the Exponential Linear Unit, more widely known as ELU, was introduced. Research reveals that … More

activation function
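
A minimal NumPy sketch of ELU (not the post's own code; alpha = 1.0 is a common default): identity for positive inputs, a saturating exponential for negative ones, so mean activations are pushed toward zero while gradients on the negative side stay alive:

    import numpy as np

    def elu(x, alpha=1.0):
        # x for x > 0, alpha * (exp(x) - 1) for x <= 0;
        # np.minimum keeps the unused exp branch from overflowing for large x
        return np.where(x > 0, x, alpha * (np.exp(np.minimum(x, 0)) - 1))

    print(elu(np.array([-2.0, 0.0, 2.0])))  # [-0.8647, 0.0, 2.0]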

A Gentle Introduction to Cross-Entropy Loss Function

Neural networks produce multiple outputs in multi-class classification problems. However, they do not have the ability to produce exact outputs; they … More

activation function, error function, loss function
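
The gist in code: softmax turns the network's raw scores into probabilities, and cross-entropy then penalizes the probability assigned to the true class. A minimal NumPy sketch (not the post's own code; the logits below are made up for illustration):

    import numpy as np

    def softmax(logits):
        # shift by the max for numerical stability; output sums to 1
        e = np.exp(logits - np.max(logits))
        return e / e.sum()

    def cross_entropy(probs, true_class):
        # for a one-hot target, the loss reduces to -log(p_true)
        return -np.log(probs[true_class])

    probs = softmax(np.array([2.0, 1.0, 0.1]))
    print(cross_entropy(probs, true_class=0))  # ≈ 0.417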

Logarithm of Sigmoid As a Neural Networks Activation Function

Previously, we reviewed the sigmoid function as an activation function for neural networks. The logarithm of sigmoid is a modified version of it. Unlike … More

activation function, derivative, neural networks
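
A minimal NumPy sketch of the idea (not the post's own code): log(sigmoid(x)) can be written as -log(1 + exp(-x)), and its derivative collapses to the neat form 1 - sigmoid(x) = sigmoid(-x):

    import numpy as np

    def log_sigmoid(x):
        # log(1 / (1 + exp(-x))) = -log(1 + exp(-x)), stable via logaddexp
        return -np.logaddexp(0, -x)

    def log_sigmoid_derivative(x):
        # d/dx log(sigmoid(x)) = 1 - sigmoid(x) = sigmoid(-x)
        return 1 / (1 + np.exp(x))

    x = np.array([-2.0, 0.0, 2.0])
    print(log_sigmoid(x))             # [-2.1269, -0.6931, -0.1269]
    print(log_sigmoid_derivative(x))  # [ 0.8808,  0.5,     0.1192]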

Softsign as a Neural Networks Activation Function

Activation functions play a pivotal role in neural networks. As an alternative to the hyperbolic tangent, softsign is an activation function for … More

activation function, derivative, neural networks
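
A minimal NumPy sketch of softsign next to tanh (not the post's own code): both squash inputs into (-1, 1), but softsign is the curve x / (1 + |x|), which approaches its asymptotes more slowly:

    import numpy as np

    def softsign(x):
        # squashes to (-1, 1) like tanh, but with gentler saturation
        return x / (1 + np.abs(x))

    def softsign_derivative(x):
        # d/dx softsign = 1 / (1 + |x|)^2
        return 1 / (1 + np.abs(x)) ** 2

    x = np.array([-4.0, 0.0, 4.0])
    print(softsign(x), np.tanh(x))  # [-0.8, 0.0, 0.8] vs [-0.9993, 0.0, 0.9993]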

Developers vs Mathematicians

Fermat’s last theorem waited almost 350 years to be proven. Now we can call it a theorem, but until the 90s … More

developer, fermat, homer simpson

ReLU as Neural Networks Activation Function

The rectified linear unit, more widely known as ReLU, has become popular over the past several years since its … More

activation function, neural networks
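
The whole function fits in two lines of NumPy (a minimal sketch, not the post's own code): ReLU passes positives through and zeroes negatives, and its subgradient is just a 0/1 mask, which is why it is so cheap and so resistant to vanishing gradients:

    import numpy as np

    def relu(x):
        # element-wise max(0, x)
        return np.maximum(0, x)

    def relu_derivative(x):
        # subgradient: 1 where x > 0, 0 elsewhere (0 chosen at x = 0)
        return (x > 0).astype(float)

    x = np.array([-1.5, 0.0, 3.0])
    print(relu(x), relu_derivative(x))  # [0.0, 0.0, 3.0] and [0.0, 0.0, 1.0]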


Licensed under a Creative Commons Attribution 4.0 International License.

You can use any content of this blog to the extent that you cite or reference it.
