Even the worthy Homer sometimes nods. The idiom means even the most gifted person occasionally makes mistakes. We would adapt this … More
Category: Machine Learning
AI: A One-Day Wonder or an Everlasting Challenge
Contests between humans and computers began with the Mechanical Turk, a historical chess-playing automaton constructed in the 18th century. However, … More
Adaptive Learning in Neural Networks
Gradient descent is one of the most powerful optimization methods. However, learning time is a challenge, too. The standard version of gradient … More
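As an illustration of the vanilla update the excerpt refers to, here is a minimal sketch of gradient descent on a toy objective. The function f(w) = (w − 3)², the learning rate, and the class name are illustrative assumptions, not code from the post.

```java
// Minimal sketch of vanilla gradient descent minimizing f(w) = (w - 3)^2.
// Objective, learning rate, and names are illustrative assumptions.
public class GradientDescentDemo {

    // f'(w) for the toy objective f(w) = (w - 3)^2
    static double derivative(double w) {
        return 2 * (w - 3);
    }

    public static double minimize(double w, double learningRate, int steps) {
        for (int i = 0; i < steps; i++) {
            w -= learningRate * derivative(w); // step against the gradient
        }
        return w;
    }

    public static void main(String[] args) {
        System.out.println(minimize(0.0, 0.1, 100)); // approaches the minimum at w = 3
    }
}
```

With a learning rate that is too large the iterates oscillate or diverge, which is exactly the learning-time trade-off the post hints at.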
Incorporating Momentum Into Neural Networks Learning
Newton’s cradle is the most popular example of momentum conservation. A lifted and released sphere strikes the stationary spheres and … More
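The conservation idea above maps onto a velocity term in the update rule. This is a hedged sketch of the classical momentum update on a toy quadratic; the objective, coefficients, and names are assumptions for demonstration only.

```java
// Sketch of gradient descent with classical momentum on f(w) = (w - 3)^2.
// The velocity v accumulates a decaying sum of past gradients, so steps
// in a consistent direction build up speed, like the cradle's spheres.
public class MomentumDemo {

    public static double minimize(double w, double lr, double momentum, int steps) {
        double v = 0.0;
        for (int i = 0; i < steps; i++) {
            double grad = 2 * (w - 3);     // f'(w) for the toy objective
            v = momentum * v - lr * grad;  // velocity update
            w += v;                        // parameter update
        }
        return w;
    }

    public static void main(String[] args) {
        System.out.println(minimize(0.0, 0.1, 0.9, 300)); // converges toward w = 3
    }
}
```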
Even Superheroes Need to Rest: Working on Trained Neural Networks in Weka
Applying neural networks can be divided into two phases: learning and forecasting. The learning phase has a high cost, whereas forecasting … More
Building Neural Networks with Weka in Java
Building neural network models and implementing learning involve a lot of math, which can be tedious. Herein, some tools help researchers to … More
Hyperbolic Tangent as Neural Network Activation Function
In neural networks, as an alternative to the sigmoid function, the hyperbolic tangent function can be used as the activation function. When you … More
Backpropagation Implementation: Neural Networks Learning From Theory To Action
We’ve focused on the math behind neural network learning and the proof of the backpropagation algorithm. Let’s face it, the mathematical background … More
The Math Behind Neural Networks Learning with Backpropagation
Neural networks are one of the most powerful machine learning algorithms. However, their background can be confusing because of the complex … More
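For reference, the weight-update rule that derivations like this one typically arrive at (assuming gradient descent with sigmoid activations, in the standard textbook notation rather than the post's own symbols) is:

```latex
% Gradient-descent weight update with learning rate \alpha,
% input activation o_i and error term \delta_j of the downstream unit:
\Delta w_{ij} = -\alpha \, \frac{\partial E}{\partial w_{ij}} = \alpha \, \delta_j \, o_i

% Output unit, using the sigmoid derivative o_j (1 - o_j):
\delta_j = (t_j - o_j) \, o_j (1 - o_j)

% Hidden unit, propagating error back through the outgoing weights:
\delta_j = o_j (1 - o_j) \sum_k \delta_k \, w_{jk}
```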
Sigmoid Function as Neural Network Activation Function
The sigmoid function (aka logistic function) is mostly picked as the activation function in neural networks because its derivative is easy to demonstrate. … More
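The easy-derivative property the excerpt mentions comes from the identity σ′(x) = σ(x)(1 − σ(x)). A minimal sketch, with illustrative class and method names:

```java
// Sigmoid (logistic) activation function.
// Its derivative simplifies to sigma(x) * (1 - sigma(x)), so once the
// activation is computed in the forward pass, backpropagation reuses it.
public class SigmoidActivation {

    public static double activate(double x) {
        return 1.0 / (1.0 + Math.exp(-x)); // squashes input into (0, 1)
    }

    public static double derivative(double x) {
        double s = activate(x);
        return s * (1 - s); // peaks at 0.25 when x = 0
    }
}
```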