Today’s world limits the length of our expressions to 140 characters. Whether you text an SMS or post a tweet, you have to fit your…
AI: a one-day wonder or an everlasting challenge
Debates between humans and computers started with the Mechanical Turk, a historical autonomous chess player constructed in the 18th century. However,…
Adaptive Learning in Neural Networks
Gradient descent is one of the most powerful optimization methods. However, learning time is a challenge, too. The standard version of gradient…
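For reference, the standard gradient descent update for a weight $w$ with a fixed learning rate $\alpha$ can be written as follows; adaptive schemes vary $\alpha$ during training (generic notation, not necessarily the post's own symbols):

$$w_{t+1} = w_t - \alpha \, \frac{\partial E}{\partial w_t}$$

where $E$ is the error function being minimized.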
Incorporating Momentum Into Neural Networks Learning
Newton’s cradle is the most popular example of momentum conservation. A lifted and released sphere strikes the stationary spheres and…
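As a rough sketch of the idea in generic notation (not necessarily the post's own symbols), momentum adds a fraction of the previous weight change to the current one:

$$\Delta w_t = -\alpha \, \frac{\partial E}{\partial w_t} + \mu \, \Delta w_{t-1}, \qquad w_{t+1} = w_t + \Delta w_t$$

where $\alpha$ is the learning rate and $\mu$ is the momentum coefficient.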
Even Superheroes Need to Rest: Working on Trained Neural Networks in Weka
Applying neural networks can be divided into two phases: learning and forecasting. The learning phase has a high cost, whereas forecasting…
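A minimal sketch of reusing an already trained network in Weka, assuming the model was trained and serialized elsewhere; the file names mlp.model and unseen.arff are placeholders, not values from the post:

```java
import weka.classifiers.functions.MultilayerPerceptron;
import weka.core.Instances;
import weka.core.SerializationHelper;
import weka.core.converters.ConverterUtils.DataSource;

public class ReuseTrainedModel {
    public static void main(String[] args) throws Exception {
        // Load a previously trained network instead of learning from scratch
        MultilayerPerceptron mlp =
            (MultilayerPerceptron) SerializationHelper.read("mlp.model");

        // Load unseen data and mark the class attribute (placeholder path)
        Instances unseen = new DataSource("unseen.arff").getDataSet();
        unseen.setClassIndex(unseen.numAttributes() - 1);

        // Forecasting phase: cheap compared to the learning phase
        for (int i = 0; i < unseen.numInstances(); i++) {
            double prediction = mlp.classifyInstance(unseen.instance(i));
            System.out.println("Instance " + i + " -> " + prediction);
        }
    }
}
```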
Building Neural Networks with Weka In Java
Building neural network models and implementing learning involve a lot of math, which might be boring. Herein, some tools help researchers to…
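As a hedged sketch of what that looks like with Weka's MultilayerPerceptron (the ARFF file name and parameter values below are assumptions for illustration, not the post's exact settings):

```java
import weka.classifiers.functions.MultilayerPerceptron;
import weka.core.Instances;
import weka.core.SerializationHelper;
import weka.core.converters.ConverterUtils.DataSource;

public class BuildNetwork {
    public static void main(String[] args) throws Exception {
        // Load training data (placeholder file name)
        Instances train = new DataSource("train.arff").getDataSet();
        train.setClassIndex(train.numAttributes() - 1);

        // Configure the network; values chosen only for illustration
        MultilayerPerceptron mlp = new MultilayerPerceptron();
        mlp.setLearningRate(0.1);
        mlp.setMomentum(0.2);
        mlp.setTrainingTime(500);   // number of epochs
        mlp.setHiddenLayers("3");   // one hidden layer with 3 nodes

        // Learning phase
        mlp.buildClassifier(train);

        // Persist the trained model for a later forecasting phase
        SerializationHelper.write("mlp.model", mlp);
    }
}
```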
Hyperbolic Tangent as Neural Network Activation Function
In neural networks, the hyperbolic tangent function can be used as an activation function as an alternative to the sigmoid function. When you…
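For reference, the hyperbolic tangent and its convenient derivative (standard definitions, stated here for completeness):

$$\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}, \qquad \frac{d}{dx}\tanh(x) = 1 - \tanh^{2}(x)$$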
Backpropagation Implementation: Neural Networks Learning From Theory To Action
We’ve focused on the math behind neural network learning and the proof of the backpropagation algorithm. Let’s face it, mathematical background…
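To give a flavor of the implementation step, here is a tiny, self-contained sketch of one gradient-descent update for a weight feeding a sigmoid output unit; the variable names and sample values are hypothetical, not taken from the post's code:

```java
public class BackpropStep {
    // One weight update for a sigmoid output unit.
    // Names and values are illustrative only.
    static double updateWeight(double weight, double previousOutput,
                               double output, double target, double learningRate) {
        // delta: error times the sigmoid derivative output * (1 - output)
        double delta = (output - target) * output * (1 - output);
        return weight - learningRate * delta * previousOutput;
    }

    public static void main(String[] args) {
        double updated = updateWeight(0.5, 0.8, 0.7, 1.0, 0.1);
        System.out.println("updated weight = " + updated);
    }
}
```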
The Math Behind Neural Networks Learning with Backpropagation
Neural networks are one of the most powerful machine learning algorithms. However, their background might confuse brains because of complex…
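The "complex" part is essentially the chain rule. For a weight $w_{ij}$ feeding unit $j$, the derivative of the error $E$ factors as (generic notation):

$$\frac{\partial E}{\partial w_{ij}} = \frac{\partial E}{\partial o_j} \cdot \frac{\partial o_j}{\partial \text{net}_j} \cdot \frac{\partial \text{net}_j}{\partial w_{ij}}$$

where $\text{net}_j$ is the weighted input to unit $j$ and $o_j$ is its activation.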
Sigmoid Function as Neural Network Activation Function
The sigmoid function (aka logistic function) is mostly picked as the activation function in neural networks because its derivative is easy to demonstrate.…
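To back up that claim (a standard result, stated here for completeness):

$$\sigma(x) = \frac{1}{1 + e^{-x}}, \qquad \sigma'(x) = \sigma(x)\,\bigl(1 - \sigma(x)\bigr)$$

which is why the derivative is so cheap to compute once $\sigma(x)$ is already known.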
