Sinc as a Neural Network Activation Function

The sinc function is a sinusoidal activation function for neural networks. In contrast to other common activation functions, it rises and falls repeatedly. However, the function still saturates: its output converges to zero for large positive and negative inputs.

[Figure: Sinc function dance move (imaginary)]

The function is defined as sine of x over x. Dividing by zero leaves an expression undefined, so the function would be undefined at x = 0. That is why the definition includes an exception point: sinc(0) is defined to be 1.



f(x) = sin(x)/x for x ≠ 0

f(x) = 1 for x = 0
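
This piecewise definition translates directly into code. Here is a minimal NumPy sketch of my own (note that NumPy's built-in np.sinc computes the normalized variant sin(πx)/(πx), so the unnormalized form is written by hand with a guard around the division):

import numpy as np

def sinc(x):
    # Unnormalized sinc: sin(x)/x, with the exception point sinc(0) = 1.
    x = np.asarray(x, dtype=float)
    # Replace zeros before dividing so that 0/0 is never evaluated;
    # the placeholder result at x = 0 is overwritten with 1 afterwards.
    safe_x = np.where(x == 0.0, 1.0, x)
    return np.where(x == 0.0, 1.0, np.sin(safe_x) / safe_x)

print(sinc(0.0))    # 1.0, the exception point
print(sinc(np.pi))  # ~0.0, the first zero crossing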

The function is illustrated below. As seen, its output approaches zero as x increases in either the positive or the negative direction.

[Figure: The sinc function]

The function actually resembles the cosine function, except that its amplitude decays in proportion to the distance from the origin.

[Figure: sinc(x) vs cos(x)]
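
We can check this decay numerically. Wherever |sin(x)| hits 1, the magnitude of sinc equals exactly 1/x, so the peaks of the curve trace out the envelope 1/|x| (a quick check, reusing the sinc sketch above):

for x in [np.pi / 2, 5 * np.pi / 2, 9 * np.pi / 2]:  # points where |sin(x)| = 1
    print(round(x, 3), round(float(abs(sinc(x))), 4), round(1 / x, 4))
# the last two columns are identical: the envelope of sinc is 1/|x|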

Funnily enough, the name of the function comes from “cardinal sine”. The widest area of the graph resembles a cardinal with its hat-like crest.

[Figure: where the name “cardinal sine” comes from]

Derivative

An activation function is useful only if it is differentiable, because its derivative is involved in backpropagation while a neural network learns.

As mentioned before, the function has an exception point at x = 0. There, the derivative is 0: sinc is smooth at the origin, since sin(x)/x = 1 − x²/6 + x⁴/120 − … and the derivative of this series vanishes at x = 0. For x ≠ 0, we need to differentiate sin(x)/x directly.

y = sin(x)/x

The quotient rule says that the derivative of the ratio of two differentiable functions can be expressed in the following form.

(f/g)’ = (f’g − fg’) / g²

Apply the rule to the sinc function.

f’(x) = (sin(x)/x)’ = (sin’(x)·x − sin(x)·x’) / x² = (cos(x)·x − sin(x)·1) / x²

We can express the derivative in a simpler form.

f’(x) = cos(x)/x − sin(x)/x²

y = sin(x)/x for x ≠ 0

y = 1 for x = 0

dy/dx = cos(x)/x − sin(x)/x², if x ≠ 0

dy/dx = 0, if x = 0
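
To make the closed form concrete, here is a matching NumPy sketch of the derivative with a central finite-difference check (the helper names are mine; the sinc function from the earlier sketch is reused):

def sinc_grad(x):
    # dy/dx = cos(x)/x - sin(x)/x**2 for x != 0, and 0 at x = 0.
    x = np.asarray(x, dtype=float)
    safe_x = np.where(x == 0.0, 1.0, x)
    grad = np.cos(safe_x) / safe_x - np.sin(safe_x) / safe_x ** 2
    return np.where(x == 0.0, 0.0, grad)

h = 1e-6
for x0 in [0.0, 1.0, -2.5]:
    numeric = (sinc(x0 + h) - sinc(x0 - h)) / (2 * h)
    print(x0, float(sinc_grad(x0)), float(numeric))  # both columns agree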

Periodic functions such as sine or cosine are not popular as transfer functions in neural networks. Sinc, however, is different: even though it oscillates, it saturates as the input grows positively or negatively, just like common activation functions such as sigmoid or tanh. That is why cardinal sine is a powerful alternative for the activation unit in a neural network.
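
As an illustration, sinc can be wired into a network as a custom activation. The sketch below uses TensorFlow/Keras under the assumption that you define the activation yourself (sinc is not a built-in Keras activation; the zero-guard mirrors the NumPy version above):

import tensorflow as tf

def sinc_activation(x):
    # Guard the division so the x = 0 entries never evaluate 0/0.
    safe_x = tf.where(tf.equal(x, 0.0), tf.ones_like(x), x)
    return tf.where(tf.equal(x, 0.0), tf.ones_like(x), tf.sin(safe_x) / safe_x)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(16, activation=sinc_activation),  # sinc hidden layer
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")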

[Figure: Friedrich Wilhelm Bessel; sin(x)/x is also the zeroth-order spherical Bessel function]

Let’s dance

These are the dance moves of the most common activation functions in deep learning. Make sure to turn the volume up 🙂



