Logarithm of Sigmoid as a Neural Network Activation Function

Previously, we reviewed the sigmoid function as an activation function for neural networks. The logarithm of sigmoid is a modified version of it. Unlike sigmoid, the log of sigmoid produces outputs in the range (-∞, 0). In this post, we'll cover how to use the logarithmic sigmoid in feedforward and backpropagation in neural networks.

Natural log of sigmoid (inspired by Imaginary)

Transfer Function

y = log(1 / (1 + e^(-x)))
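
Here is a minimal Python sketch of this transfer function, assuming NumPy and the natural logarithm (the choice of base is discussed next).

```python
import numpy as np

def log_sigmoid(x):
    # log of sigmoid: log(1 / (1 + e^(-x))), natural log assumed
    return np.log(1.0 / (1.0 + np.exp(-x)))

# outputs are always negative, e.g. log_sigmoid(0) = log(0.5) ≈ -0.693
print(log_sigmoid(np.array([-2.0, 0.0, 2.0])))
```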



Notice that log(x) denotes the base-2 logarithm in computer science, the base-e logarithm in mathematical analysis, and the base-10 logarithm in logarithm tables. Any log base could be meant in this equation. That's why I'll rewrite the equation in a more generalized form.

y = log_b(1 / (1 + e^(-x)))

The logarithm base change rule states that log_b(x) = log_c(x) / log_c(b). We'll apply the base change rule to the equation with base e (the natural logarithm).

y = log_b(1 / (1 + e^(-x))) = log_e(1 / (1 + e^(-x))) / log_e(b) = ln(1 / (1 + e^(-x))) / ln(b)

Now, we'll apply the logarithm quotient rule to the numerator.

y = ln(1 / (1 + e^(-x))) / ln(b) = [ln(1) - ln(1 + e^(-x))] / ln(b)

Notice that the natural logarithm of 1 is equal to 0.

y = [ln(1) - ln(1 + e^(-x))] / ln(b) = [0 - ln(1 + e^(-x))] / ln(b) = -ln(1 + e^(-x)) / ln(b)
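
The simplification above can be double-checked numerically. The following sketch, assuming NumPy and an arbitrarily chosen base of 10, compares log_b(sigmoid(x)) computed via the base change rule against the simplified form.

```python
import numpy as np

b = 10.0                    # arbitrary base for the check
x = np.linspace(-5, 5, 11)

direct = np.log(1.0 / (1.0 + np.exp(-x))) / np.log(b)   # log_b(sigmoid(x)) via base change
simplified = -np.log(1.0 + np.exp(-x)) / np.log(b)      # -ln(1 + e^(-x)) / ln(b)

print(np.allclose(direct, simplified))  # True
```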





Derivative

Now, it is time to take the derivative.

y = -ln(1 + e^(-x)) / ln(b)

dy/dx = d(-ln(1 + e^(-x)) / ln(b)) / dx

We can move the constant factors ln(b) and -1 outside of the derivative.

dy/dx = (-1/ln(b)) · d(ln(1 + e^(-x))) / dx

Notice that the derivative of ln(x) is equal to 1/x, and the chain rule applies to the inner term 1 + e^(-x).

dy/dx = (-1/ln(b)) · d(ln(1 + e^(-x))) / dx = (-1/ln(b)) · (1/(1 + e^(-x))) · d(1 + e^(-x)) / dx

dy/dx = (-1/ln(b)) · (1/(1 + e^(-x))) · e^(-x) · (-1) = (1/ln(b)) · (1/(1 + e^(-x))) · e^(-x)

dy/dx = e^(-x) / (ln(b) · (1 + e^(-x)))

To simplify the derivative, we'll multiply both the numerator and the denominator by e^x.





dy/dx = e^(-x) / (ln(b) · (1 + e^(-x))) = (e^(-x) · e^x) / (ln(b) · (1 + e^(-x)) · e^x) = 1 / (ln(b) · (e^x + 1))

To sum up, the activation function and its derivative for the logarithm of sigmoid are shown below.

y = log_b(1 / (1 + e^(-x)))

dy/dx = 1 / (ln(b) · (e^x + 1))
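
A finite-difference check is an easy way to confirm this result. The sketch below, assuming NumPy and arbitrarily picking base 2 and the point x = 0.7, compares the analytic derivative against a central difference approximation.

```python
import numpy as np

def log_sigmoid_b(x, b):
    # y = -ln(1 + e^(-x)) / ln(b)
    return -np.log(1.0 + np.exp(-x)) / np.log(b)

def log_sigmoid_b_derivative(x, b):
    # dy/dx = 1 / (ln(b) * (e^x + 1))
    return 1.0 / (np.log(b) * (np.exp(x) + 1.0))

b, x, eps = 2.0, 0.7, 1e-6
numeric = (log_sigmoid_b(x + eps, b) - log_sigmoid_b(x - eps, b)) / (2 * eps)
analytic = log_sigmoid_b_derivative(x, b)
print(numeric, analytic)  # the two values agree closely
```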

Natural Logarithm of Sigmoid

We've produced a generalized form for the derivative of the logarithm of sigmoid. Setting b to e gives the derivative of the natural logarithm of sigmoid. Then, the derivative takes an even simpler form.

y = log_e(1 / (1 + e^(-x))) = ln(1 / (1 + e^(-x)))

dy/dx = 1 / (ln(e) · (e^x + 1))

Notice that the logarithm of its own base is equal to 1; in particular, ln(e) = 1.

dy/dx = 1 / (ln(e) · (e^x + 1)) = 1 / (e^x + 1)

To sum up, the activation function and its derivative for the natural logarithm of sigmoid are shown below.





y = ln(1 / (1 + e^(-x)))

dy/dx = 1 / (e^x + 1)
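
As a minimal sketch of how these formulas could be wired into feedforward and backpropagation, assuming NumPy (the function names here are illustrative, not from any particular library): the forward pass uses np.logaddexp(0, -x) = ln(1 + e^(-x)) for numerical stability, and the derivative 1 / (e^x + 1) is simply sigmoid(-x).

```python
import numpy as np

def log_sigmoid_forward(x):
    # forward pass: ln(sigmoid(x)) = -ln(1 + e^(-x)), computed stably with logaddexp
    return -np.logaddexp(0.0, -x)

def log_sigmoid_backward(x, upstream_grad):
    # backward pass: dy/dx = 1 / (e^x + 1), i.e. sigmoid(-x), times the upstream gradient
    return upstream_grad * (1.0 / (np.exp(x) + 1.0))

x = np.array([-3.0, 0.0, 3.0])
y = log_sigmoid_forward(x)
grad = log_sigmoid_backward(x, np.ones_like(x))  # upstream gradient of ones for illustration
print(y)
print(grad)
```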

Logarithm of sigmoid and its derivative

Let’s dance

These are the dance moves of the most common activation functions in deep learning. Make sure to turn the volume up 🙂

