Newton’s cradle is the most popular example of momentum conservation. A lifted and released sphere strikes the stationary spheres and … More
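The cradle behaviour the excerpt describes follows from conserving both momentum and kinetic energy in a one-dimensional elastic collision. A minimal Java sketch (illustrative only, not from the linked post):

```java
public class ElasticCollision {
    // One-dimensional elastic collision: momentum and kinetic energy
    // are both conserved, which gives closed-form final velocities.
    static double[] collide(double m1, double v1, double m2, double v2) {
        double u1 = ((m1 - m2) * v1 + 2.0 * m2 * v2) / (m1 + m2);
        double u2 = ((m2 - m1) * v2 + 2.0 * m1 * v1) / (m1 + m2);
        return new double[] { u1, u2 };
    }

    public static void main(String[] args) {
        // Equal masses, target at rest: the moving sphere stops and the
        // struck sphere leaves with the incoming velocity -- exactly the
        // Newton's cradle behaviour.
        double[] u = collide(1.0, 2.0, 1.0, 0.0);
        System.out.println(u[0] + " " + u[1]); // prints 0.0 2.0
    }
}
```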
Author: Sefik Serengil
Even Superheroes Need to Rest: Working on Trained Neural Networks in Weka
Applying neural networks can be divided into two phases: learning and forecasting. The learning phase has a high cost, whereas forecasting … More
Building Neural Networks with Weka In Java
Building neural network models and implementing learning involves a lot of math, and this might be boring. Herein, some tools help researchers to … More
Hyperbolic Tangent as Neural Network Activation Function
In neural networks, the hyperbolic tangent function can be used as the activation function as an alternative to the sigmoid. When you … More
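The alternative the excerpt mentions can be sketched in a few lines of Java (a minimal illustration, not code from the linked post):

```java
public class TanhActivation {
    // Hyperbolic tangent activation: tanh(x) = (e^x - e^-x) / (e^x + e^-x).
    // Unlike the sigmoid, whose output lies in (0, 1), tanh maps to
    // (-1, 1) and is zero-centered.
    static double activate(double x) {
        return Math.tanh(x);
    }

    // Its derivative is a simple function of the output: 1 - tanh(x)^2.
    static double derivative(double x) {
        double t = Math.tanh(x);
        return 1.0 - t * t;
    }

    public static void main(String[] args) {
        System.out.println(activate(0.0));   // prints 0.0
        System.out.println(derivative(0.0)); // prints 1.0
    }
}
```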
Backpropagation Implementation: Neural Networks Learning From Theory To Action
We’ve focused on the math behind neural network learning and the proof of the backpropagation algorithm. Let’s face it, the mathematical background … More
The Math Behind Neural Networks Learning with Backpropagation
Neural networks are one of the most powerful machine learning algorithms. However, their background might confuse brains because of the complex … More
Sigmoid Function as Neural Network Activation Function
The sigmoid function (aka logistic function) is mostly picked as the activation function in neural networks because its derivative is easy to demonstrate. … More
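The point of the excerpt above, that the sigmoid's derivative has a simple closed form, can be sketched in Java (illustrative only, not from the linked post):

```java
public class Sigmoid {
    // Logistic (sigmoid) activation: sigma(x) = 1 / (1 + e^(-x)).
    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // The derivative is expressible via the output itself:
    // sigma'(x) = sigma(x) * (1 - sigma(x)),
    // which makes it cheap to reuse during backpropagation.
    static double sigmoidDerivative(double x) {
        double s = sigmoid(x);
        return s * (1.0 - s);
    }

    public static void main(String[] args) {
        System.out.println(sigmoid(0.0));           // prints 0.5
        System.out.println(sigmoidDerivative(0.0)); // prints 0.25
    }
}
```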
Introduction to Neural Networks: A Mechanism Taking Lessons From The Past
Neural networks are inspired by the human central nervous system. They are based on making mistakes and learning lessons from past errors. They … More
Exponential Smoothing: A Forecasting Approach Smoke Pleasure Triggered
Smoothing methods basically generalize a time series based on previous examples’ seasonal effects and trends. In this way, these methods … More
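The idea the excerpt refers to can be sketched as simple exponential smoothing, one common member of that family of methods; the smoothing factor and sample series below are illustrative assumptions:

```java
public class ExponentialSmoothing {
    // Simple exponential smoothing:
    //   s_t = alpha * x_t + (1 - alpha) * s_{t-1},  with s_0 = x_0.
    // Recent observations get weight alpha; older ones decay geometrically.
    static double[] smooth(double[] series, double alpha) {
        double[] s = new double[series.length];
        s[0] = series[0];
        for (int t = 1; t < series.length; t++) {
            s[t] = alpha * series[t] + (1.0 - alpha) * s[t - 1];
        }
        return s;
    }

    public static void main(String[] args) {
        double[] s = smooth(new double[] {10, 20, 30}, 0.5);
        System.out.println(s[0] + " " + s[1] + " " + s[2]); // prints 10.0 15.0 22.5
    }
}
```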
Image is everything
You might remember the Sprite advertisements of the ’90s. The brand had a motto: image is nothing, thirst is everything, … More
