Convolutional neural networks have made the ReLU activation function popular. Common alternatives such as sigmoid or tanh have upper limits to … More

# Tag: activation function

# Sinc as a Neural Networks Activation Function

The sinc function is a sinusoidal activation function for neural networks. In contrast to other common activation functions, it has rises … More
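As a quick illustration (not taken from the post itself), the standard definition of sinc, with the 0/0 singularity at the origin filled in by its limit, can be sketched as:

```python
import math

def sinc(x):
    # sinc(0) is defined as 1, the limit of sin(x)/x as x -> 0
    if x == 0:
        return 1.0
    return math.sin(x) / x
```

Note that sinc oscillates and decays toward zero as |x| grows, unlike monotonic activations such as sigmoid.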

# ELU as a Neural Networks Activation Function

Recently, a new activation function named the Exponential Linear Unit, more widely known as ELU, was introduced. Research reveals that … More
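For reference, a minimal sketch of ELU under its standard definition (identity for positive inputs, α(eˣ − 1) for negative inputs), with α defaulting to 1:

```python
import math

def elu(x, alpha=1.0):
    # Identity for positive inputs; smooth exponential saturation
    # toward -alpha for negative inputs
    return x if x > 0 else alpha * (math.exp(x) - 1.0)
```

The smooth negative branch lets gradients flow for negative inputs, which plain ReLU blocks entirely.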

# A Gentle Introduction to Cross-Entropy Loss Function

Neural networks produce multiple outputs in multiclass classification problems. However, they do not have the ability to produce exact outputs; they … More
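As an illustrative sketch (the post's own derivation is truncated above), cross-entropy between a one-hot target vector and a predicted probability distribution is −Σ tᵢ log pᵢ:

```python
import math

def cross_entropy(targets, predictions):
    # -sum(t * log(p)); a small epsilon guards against log(0)
    eps = 1e-12
    return -sum(t * math.log(p + eps)
                for t, p in zip(targets, predictions))
```

A perfect prediction yields a loss near zero; the more probability mass placed on wrong classes, the larger the loss.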

# Logarithm of Sigmoid As a Neural Networks Activation Function

Previously, we reviewed the sigmoid function as an activation function for neural networks. The logarithm of sigmoid is a modified version of it. Unlike … More
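A minimal sketch of log-sigmoid, log(1 / (1 + e⁻ˣ)), written in the numerically stable split form (a common implementation trick, not necessarily the one the post uses):

```python
import math

def log_sigmoid(x):
    # log(sigmoid(x)) computed without overflowing exp()
    if x >= 0:
        return -math.log1p(math.exp(-x))   # exp(-x) <= 1 here
    return x - math.log1p(math.exp(x))     # exp(x) <= 1 here
```

Naively computing `math.log(1 / (1 + math.exp(-x)))` overflows for large negative x, which is why the two-branch form is preferred.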

# Softsign as a Neural Networks Activation Function

Activation functions play a pivotal role in neural networks. As an alternative to the hyperbolic tangent, softsign is an activation function for … More
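For illustration, softsign under its standard definition x / (1 + |x|):

```python
def softsign(x):
    # Squashes inputs into (-1, 1), like tanh, but with
    # polynomial rather than exponential saturation
    return x / (1.0 + abs(x))
```

Like tanh it is bounded in (−1, 1) and passes through the origin, but it approaches its asymptotes more gradually.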

# Softmax as a Neural Networks Activation Function

Convolutional neural networks have popularized softmax as an activation function. However, softmax is not a traditional activation function. … More
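Unlike the element-wise activations above, softmax operates on a whole vector, mapping it to a probability distribution. A minimal sketch of the standard definition, with the usual max-subtraction trick for numerical stability:

```python
import math

def softmax(xs):
    # Subtract the max before exponentiating so exp() cannot overflow
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
```

The outputs are all positive and sum to 1, which is why softmax is typically used on the final layer of a multiclass classifier rather than on hidden layers.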

# ReLU as Neural Networks Activation Function

The rectified linear unit, more widely known as ReLU, has become popular over the past several years since its … More
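ReLU is simple enough to state in one line; for illustration:

```python
def relu(x):
    # Passes positive inputs through unchanged; clips negatives to zero
    return max(0.0, x)
```

Its gradient is 1 for positive inputs and 0 for negative ones, which avoids the vanishing-gradient problem that saturating functions like sigmoid suffer from.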

# Softplus as a Neural Networks Activation Function

The activation unit calculates the net output of a neural cell in neural networks. The backpropagation algorithm multiplies the derivative of the … More
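Since the teaser mentions that backpropagation multiplies by the activation's derivative, here is an illustrative sketch of softplus, log(1 + eˣ), together with its derivative, which works out to the sigmoid function:

```python
import math

def softplus(x):
    # Smooth approximation of ReLU; for large x, log(1 + exp(x)) ~ x
    return math.log1p(math.exp(x)) if x < 30 else float(x)

def softplus_derivative(x):
    # d/dx log(1 + exp(x)) = sigmoid(x)
    return 1.0 / (1.0 + math.exp(-x))
```

The cutoff at x = 30 is an assumption to avoid overflow in `exp()`; beyond it the identity is an excellent approximation.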

# Step Function as a Neural Network Activation Function

Activation functions are the decision-making units of neural networks. They calculate the net output of a neural node. Herein, the Heaviside step … More
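For illustration, the Heaviside step function fires 1 once the net input reaches a threshold and 0 otherwise (the threshold parameter is added here for generality; the classic definition uses 0):

```python
def heaviside_step(x, threshold=0.0):
    # Hard binary decision: 1 if the input reaches the threshold, else 0
    return 1 if x >= threshold else 0
```

Because its derivative is zero almost everywhere, the step function cannot be trained with gradient descent, which is why smooth activations replaced it in modern networks.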