Category Archives: Math

Step Function as a Neural Network Activation Function

Activation functions are the decision-making units of neural networks; they calculate the net output of a neural node. The Heaviside step function is one of the most common activation functions in neural networks. It produces binary output, which is why it is also called the binary step function: it produces 1 (or true) when the input passes the threshold and 0 (or false) when it does not. That is why it is very useful for binary classification studies.
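As a minimal sketch of the idea, the function below fires only when the net input reaches a threshold; the default threshold of 0 is an illustrative assumption, not a value fixed by the post.

```python
# A minimal sketch of the binary (Heaviside) step activation.
# The default threshold of 0 is an illustrative assumption.
def step(x, threshold=0.0):
    """Return 1 when the net input passes the threshold, 0 otherwise."""
    return 1 if x >= threshold else 0

print(step(0.7))   # 1 -> the node fires
print(step(-0.3))  # 0 -> the node stays silent
```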

[Figure: Heaviside Step Function Dance Move]

Human reflexes act on the same principle. A person withdraws his hand when he touches a hot surface, because his sensory neurons detect the high temperature and fire. Passing the threshold triggers a response, and the withdrawal reflex action is taken. You can think of the true output as causing the firing action.

Continue reading

Hyperbolic Tangent as Neural Network Activation Function

In neural networks, the hyperbolic tangent function can be used as an activation function as an alternative to the sigmoid. When you backpropagate, the derivative of the activation function is involved in calculating the effect of the error on the weights. The derivative of the hyperbolic tangent has a simple form, just like that of the sigmoid. This explains why the hyperbolic tangent is common in neural networks.

[Figure: Hyperbolic Tangent Dance Move (Imaginary)]

Hyperbolic Tangent Function: tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x))
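As a small sketch, the snippet below evaluates tanh from the exponential form above and its derivative 1 - tanh(x)^2, which is the simple form the paragraph refers to; math.tanh is only used as a cross-check.

```python
import math

def tanh(x):
    """Hyperbolic tangent computed from the exponential form above."""
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

def tanh_derivative(x):
    """d/dx tanh(x) = 1 - tanh(x)^2."""
    return 1 - tanh(x) ** 2

print(round(tanh(1.0), 4), round(math.tanh(1.0), 4))  # both ~0.7616
print(round(tanh_derivative(1.0), 4))                 # ~0.42
```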

Continue reading

The Math Behind Neural Networks Learning with Backpropagation

Neural networks are one of the most powerful machine learning algorithms. However, the theory behind them can be confusing because of the complex mathematical calculations involved. In this post, the math behind the neural network learning algorithm and the state of the art are covered.


Backpropagation is a very common algorithm for implementing neural network learning. It basically consists of the following steps, applied to all historical instances. First, forward propagation is applied (left to right) to compute the network output; that is the forecast value, whereas the actual value is already known. Second, the difference between the forecast and the actual value is calculated; this is called the error. Third, the error is reflected onto all the weights, and the weights are updated based on the calculated error. Finally, these steps are repeated for a chosen number of epochs (e.g. epoch = 1000).
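A minimal sketch of those four steps, assuming a single sigmoid neuron trained on the OR gate as toy data; the learning rate and epoch count are illustrative choices, not values from the post.

```python
import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]  # OR gate instances
w = [random.uniform(-1, 1) for _ in range(3)]                # two weights + bias
lr, epochs = 0.5, 1000

for _ in range(epochs):                            # step 4: repeat for the epoch count
    for (x1, x2), actual in data:
        net = w[0] * x1 + w[1] * x2 + w[2]         # step 1: feed forward
        forecast = sigmoid(net)
        error = actual - forecast                  # step 2: error = actual - forecast
        delta = error * forecast * (1 - forecast)  # sigmoid derivative term
        w[0] += lr * delta * x1                    # step 3: reflect error onto weights
        w[1] += lr * delta * x2
        w[2] += lr * delta

print([round(sigmoid(w[0] * x1 + w[1] * x2 + w[2]), 2) for (x1, x2), _ in data])
```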

Continue reading

Sigmoid Function as Neural Network Activation Function

The sigmoid function (aka logistic function) is mostly picked as the activation function in neural networks because its derivative is easy to calculate. It produces output in the range [0, 1], whereas the input is meaningful between [-5, +5]; inputs outside this range produce almost the same (saturated) outputs. In this post, we'll go over the proof of the derivative calculation.


The sigmoid function is formulated as follows:

f(x) = 1 / (1 + e^(-x))
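A short sketch of the formula above and of the derivative f(x) * (1 - f(x)) whose proof the post covers; the saturation outside roughly [-5, +5] shows up in the printed values.

```python
import math

def sigmoid(x):
    """f(x) = 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    """f'(x) = f(x) * (1 - f(x))."""
    fx = sigmoid(x)
    return fx * (1 - fx)

print(round(sigmoid(0), 2), round(sigmoid(5), 4), round(sigmoid(-5), 4))  # 0.5 0.9933 0.0067
print(round(sigmoid_derivative(0), 2))                                    # 0.25
```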

Continue reading

The Math Behind Elliptic Curves over Binary Field

In the previous post, we mentioned the math behind the addition law for elliptic curves over the Galois Field GF(p), the prime field. Now the math behind elliptic curves over the Galois Field GF(2^n), the binary field, will be covered. In the literature, elliptic curves over GF(2^n) are more common than those over GF(p) because they adapt better to computer hardware implementations.

[Figure: Elliptic Curve Binary Form]

Elliptic Curves over GF(2^n)

Algebraically, an elliptic curve over the binary field is represented in the following form:

y^2 + xy = x^3 + ax^2 + b, (b ≠ 0)

Negative Point

Suppose that P(x, y) is a point on the curve. The negative of the point P(x, y) is -P = (x, x + y), since negation is the identity in GF(2^n) and -(x + y) = x + y; moreover, -P is still on the curve.
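As a minimal sketch of these two facts, the snippet below works over GF(2^4) with the reduction polynomial x^4 + x + 1; the curve coefficients a and b are illustrative choices, not parameters from the post. It brute-forces the affine points of the curve and checks that the negative of each one is also on the curve.

```python
M = 4
MOD = 0b10011          # x^4 + x + 1, irreducible over GF(2)
A, B = 0b1000, 0b0001  # illustrative curve coefficients a and b (b != 0)

def gf_mul(x, y):
    """Multiply two GF(2^4) elements: carry-less multiply with reduction."""
    r = 0
    while y:
        if y & 1:
            r ^= x
        y >>= 1
        x <<= 1
        if x & (1 << M):
            x ^= MOD
    return r

def on_curve(x, y):
    """Check y^2 + xy = x^3 + ax^2 + b; addition in GF(2^m) is XOR."""
    lhs = gf_mul(y, y) ^ gf_mul(x, y)
    rhs = gf_mul(gf_mul(x, x), x) ^ gf_mul(A, gf_mul(x, x)) ^ B
    return lhs == rhs

def negate(x, y):
    """-P(x, y) = (x, x + y)."""
    return x, x ^ y

points = [(x, y) for x in range(16) for y in range(16) if on_curve(x, y)]
assert all(on_curve(*negate(x, y)) for x, y in points)
print(len(points), points[:4])
```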

Continue reading

The Math Behind Elliptic Curve Cryptography

Most cryptography resources mention elliptic curve cryptography, but they often ignore the math behind it and start directly with the addition formula. This approach can be very confusing for beginners. In this post, a proof of the addition formula will be illustrated for elliptic curves over the Galois Field GF(p), the prime field.

[Figure: An Illustration from cr.yp.to]

Elliptic Curves over GF(p)

Basically, an elliptic curve is represented as an equation of the following form.

y^2 = x^3 + ax + b (Weierstrass Equation)

Pre-condition: 4a^3 + 27b^2 ≠ 0 (to have 3 distinct roots)

The sum of two points on an elliptic curve is also a point on the curve. Adding two points on an elliptic curve is demonstrated in the following illustration.

P(x1, y1) + Q(x2, y2) = R(x3, y3)
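A minimal sketch of the chord-and-tangent addition behind that relation, using the textbook curve y^2 = x^3 + 2x + 2 over GF(17) as an assumed example (not a curve named in the post); the point-at-infinity and inverse-point cases are omitted for brevity.

```python
p, a = 17, 2                # illustrative prime field and curve coefficient a

def ec_add(P, Q):
    """Add two affine points on y^2 = x^3 + ax + b over GF(p)."""
    (x1, y1), (x2, y2) = P, Q
    if P == Q:              # tangent line (point doubling)
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:                   # chord through P and Q
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return x3, y3

P = (5, 1)                  # a point on y^2 = x^3 + 2x + 2 (mod 17)
print(ec_add(P, P))         # (6, 3)
print(ec_add(P, (6, 3)))    # (10, 6)
```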

Continue reading