Fermat’s Last Theorem waited almost 350 years to be proven. Now we can call it a theorem, but until the 90s … More

# Category: Math

# Softplus as a Neural Networks Activation Function

An activation unit calculates the net output of a neural cell in a neural network. The backpropagation algorithm multiplies the derivative of the … More
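The post teased above covers the softplus function. A minimal sketch of it and its derivative (function names are my own, not from the post):

```python
import math

def softplus(x):
    # softplus(x) = ln(1 + e^x), a smooth approximation of ReLU
    return math.log(1.0 + math.exp(x))

def softplus_derivative(x):
    # d/dx softplus(x) = 1 / (1 + e^-x), which is the sigmoid function
    return 1.0 / (1.0 + math.exp(-x))
```

Backpropagation uses the derivative, which conveniently turns out to be the sigmoid itself.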

# Step Function as a Neural Network Activation Function

Activation functions are the decision-making units of neural networks. They calculate the net output of a neural node. Herein, the Heaviside step … More
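The Heaviside step function mentioned in the excerpt can be sketched in a few lines (a minimal illustration; the threshold convention at zero is an assumption):

```python
def heaviside_step(x):
    # fires 1 if the net input reaches the threshold (here 0), else 0
    return 1 if x >= 0 else 0
```

Because its output is binary and its derivative is zero almost everywhere, the step function cannot be trained with gradient-based backpropagation, which is why smooth alternatives are preferred.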

# Hyperbolic Tangent as Neural Network Activation Function

In neural networks, the hyperbolic tangent function can be used as an activation function as an alternative to the sigmoid. When you … More

# The Math Behind Neural Networks Learning with Backpropagation

Neural networks are among the most powerful machine learning algorithms. However, what happens behind the scenes can be confusing because of the complex … More
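The core idea behind backpropagation can be shown on a single sigmoid neuron with squared error (a minimal sketch under my own assumptions; the post's actual network and notation may differ):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# one training example: input x, target y, one sigmoid neuron with weight w
x, y = 1.5, 1.0
w = 0.2
learning_rate = 0.5

for _ in range(100):
    out = sigmoid(w * x)                  # forward pass
    # backward pass via the chain rule for E = (1/2)(out - y)^2:
    # dE/dw = (out - y) * out * (1 - out) * x
    grad = (out - y) * out * (1.0 - out) * x
    w -= learning_rate * grad             # gradient descent update
```

Each iteration multiplies local derivatives along the chain, which is exactly the mechanism backpropagation generalizes to deep networks.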

# Sigmoid Function as Neural Network Activation Function

The sigmoid function (aka logistic function) is mostly picked as the activation function in neural networks because its derivative is easy to demonstrate. … More
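The easy-to-demonstrate derivative the excerpt alludes to is s(x)·(1 − s(x)); a minimal sketch (function names are my own):

```python
import math

def sigmoid(x):
    # logistic function: 1 / (1 + e^-x), squashes any input into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    # the derivative is expressible through the function itself: s(x) * (1 - s(x))
    s = sigmoid(x)
    return s * (1.0 - s)
```

Reusing the forward-pass output to compute the derivative is what makes the sigmoid cheap inside backpropagation.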

# The Math Behind Elliptic Curves over Binary Field

In the previous post, we mentioned the math behind the addition law for elliptic curves over the Galois field GF(p) – prime … More
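The GF(p) addition law referenced above can be sketched as follows (a minimal illustration assuming the short Weierstrass form y² = x³ + ax + b over a prime field; the binary-field GF(2^m) formulas the post covers are different):

```python
def ec_add(P, Q, a, p):
    # point addition on y^2 = x^3 + a*x + b over GF(p); None is the point at infinity
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                         # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p    # tangent slope (doubling)
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p           # chord slope
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)
```

For example, on y² = x³ + 2x + 2 over GF(17), doubling P = (5, 1) gives (6, 3).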

# The Math Behind Elliptic Curve Cryptography

Most cryptography resources mention elliptic curve cryptography, but they often ignore the math behind elliptic curve cryptography and … More