Activation functions play a pivotal role in neural networks. As an alternative to the hyperbolic tangent, softsign is an activation function for … More
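The softsign function mentioned in the excerpt is defined as f(x) = x / (1 + |x|). A minimal sketch (the function name is my own choice):

```python
def softsign(x):
    # Squashes inputs into (-1, 1), like tanh, but saturates
    # polynomially rather than exponentially.
    return x / (1 + abs(x))

print(softsign(1.0))    # halfway to its upper asymptote
print(softsign(-100.0)) # close to -1, never reaching it
```

Unlike tanh, softsign approaches its asymptotes slowly, which can keep gradients from vanishing as quickly for large inputs.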

# Tag: activation function

# Softmax as a Neural Networks Activation Function

Convolutional neural networks have popularized softmax as an activation function. However, softmax is not a traditional activation function. The … More
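Softmax differs from traditional activations in that it normalizes a whole vector of scores into a probability distribution, rather than transforming each value independently. A minimal sketch (subtracting the maximum score is a standard numerical-stability trick, not something the excerpt specifies):

```python
import math

def softmax(scores):
    # Shift by the max so exp() never overflows; this does not
    # change the result because softmax is shift-invariant.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([1.0, 2.0, 3.0])
print(probs)  # three probabilities summing to 1
```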

# ReLU as Neural Networks Activation Function

The rectified linear unit, more widely known as ReLU, has become popular over the past several years since its … More
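ReLU is simply f(x) = max(0, x). A minimal sketch, including the derivative that backpropagation would use (treating the derivative at 0 as 0 is a common convention, not a unique definition):

```python
def relu(x):
    # Passes positive inputs through unchanged; zeroes out the rest.
    return max(0.0, x)

def relu_derivative(x):
    # 1 for positive inputs, 0 otherwise; the point x = 0 is not
    # differentiable, so 0 is used by convention here.
    return 1.0 if x > 0 else 0.0

print(relu(3.0), relu(-2.0))
```

Its cheap gradient (a constant 1 on the active side) is a large part of why it trains deep networks faster than saturating functions.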

# Softplus as a Neural Networks Activation Function

The activation unit calculates the net output of a neural cell in a neural network. The backpropagation algorithm multiplies the derivative of the … More
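Softplus is f(x) = ln(1 + e^x), a smooth approximation of ReLU whose derivative happens to be the logistic sigmoid — convenient for the backpropagation step the excerpt alludes to. A minimal sketch:

```python
import math

def softplus(x):
    # Smooth, everywhere-differentiable approximation of ReLU.
    return math.log(1 + math.exp(x))

def softplus_derivative(x):
    # d/dx ln(1 + e^x) = e^x / (1 + e^x) = sigmoid(x).
    return 1 / (1 + math.exp(-x))

print(softplus(0.0))  # ln(2)
```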

# Step Function as a Neural Network Activation Function

Activation functions are the decision-making units of neural networks; they calculate the net output of a neural node. Herein, the Heaviside step … More
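The Heaviside step function is the classic all-or-nothing decision unit: it outputs 1 when the net input reaches a threshold and 0 otherwise. A minimal sketch (the threshold parameter is my own generalization; the standard definition uses 0):

```python
def heaviside_step(x, threshold=0.0):
    # Fires (outputs 1) only when the net input reaches the threshold.
    return 1 if x >= threshold else 0

print(heaviside_step(0.7), heaviside_step(-0.3))
```

Because its derivative is 0 almost everywhere, the step function cannot be trained with gradient-based backpropagation, which is why smooth functions like sigmoid replaced it.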

# Hyperbolic Tangent as Neural Network Activation Function

In neural networks, as an alternative to the sigmoid function, the hyperbolic tangent function can be used as an activation function. When you … More
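The close relationship to sigmoid mentioned in the excerpt can be made concrete: tanh is a rescaled, zero-centered sigmoid, via tanh(x) = 2·sigmoid(2x) − 1. A minimal sketch verifying the identity:

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def tanh_via_sigmoid(x):
    # tanh is a sigmoid stretched to the range (-1, 1) and
    # centered at zero: tanh(x) = 2 * sigmoid(2x) - 1.
    return 2 * sigmoid(2 * x) - 1

print(tanh_via_sigmoid(0.7), math.tanh(0.7))  # the two agree
```

The zero-centered output range (-1, 1) is the main practical advantage of tanh over sigmoid for hidden layers.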

# Sigmoid Function as Neural Network Activation Function

The sigmoid function (aka the logistic function) is most often picked as the activation function in neural networks because its derivative is easy to compute. … More
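The easy derivative the excerpt refers to is the well-known identity σ'(x) = σ(x)·(1 − σ(x)): once the forward pass has computed σ(x), the gradient comes almost for free. A minimal sketch:

```python
import math

def sigmoid(x):
    # Logistic function: maps any real input into (0, 1).
    return 1 / (1 + math.exp(-x))

def sigmoid_derivative(x):
    # Reuses the forward value: sigma'(x) = sigma(x) * (1 - sigma(x)).
    s = sigmoid(x)
    return s * (1 - s)

print(sigmoid(0.0), sigmoid_derivative(0.0))
```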