Tag Archives: classification

Becoming the MacGyver of Machine Learning with Neural Networks

If you are a member of Generation Y, you most probably remember MacGyver. He was famous for improvising solutions to unusual problems using the materials around him, with a Swiss Army knife and duct tape usually featuring in his practical fixes. In the same way, neural networks are your Swiss Army knife for machine learning studies.


Richard Dean Anderson appeared in the series as MacGyver

Whether a machine learning study is handled as supervised or unsupervised depends on what previous experiments provide.

Segmentation is a type of unsupervised learning. In this field, we look for groups of related instances. For example, a gym might group its customers as overweight and thin. However, the segmentation could be based on customer weight, body mass index, or muscle-to-body-fat ratio. In other words, there is no single correct solution: the same customer may fall into different segments in different studies.
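As a toy illustration of how arbitrary a segmentation rule can be, the sketch below groups hypothetical gym customers by body mass index with a hand-picked threshold. The customer data, the threshold of 25, and the segment names are all assumptions for the example; a different feature or cut-off would segment the same customers differently.

```java
public class Main {
    public static void main(String[] args) {
        // Hypothetical gym customers: {weight in kg, height in m}
        double[][] customers = {{95, 1.75}, {60, 1.70}, {82, 1.80}};
        for (double[] c : customers) {
            double bmi = c[0] / (c[1] * c[1]); // body mass index = kg / m^2
            // One of many possible segmentations; the threshold is arbitrary
            String segment = bmi >= 25 ? "overweight" : "normal";
            System.out.println(segment);
        }
    }
}
```

Swapping BMI for, say, body-fat ratio would move customers between segments, which is exactly why unsupervised grouping has no single correct answer.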

In contrast, instance labels are exact in supervised learning. Suppose that you are working on defaulted loans: which of the previously granted loans went bad is already known.

Continue reading

Homer Sometimes Nods: Error Metrics in Machine Learning

Even the worthy Homer sometimes nods. The idiom means that even the most gifted person occasionally makes mistakes. We can adapt this saying to the machine learning lifecycle: even the best ML models should make some mistakes (otherwise they are overfitting). The important thing is knowing how to measure those errors. There are many metrics for evaluating forecasts; in this post, we will cover the evaluation metrics most meaningful for ML studies.


Homer Simpson uses the catchphrase "D'oh!" when he has done something wrong

The sign of the difference between actual and predicted values should not be considered when calculating the total error of a system. Otherwise, a series containing equally large underestimations and overestimations might appear to have a very low total error. In fact, the total error should only be low when both underestimations and overestimations are small. Discarding the signs removes this misleading cancellation, and squaring the differences is one way to discard them. This metric is called Mean Squared Error, or simply MSE.
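A minimal sketch of MSE in plain Java, using made-up numbers chosen so that one overestimation (+2) and one underestimation (-2) would cancel to zero error if we naively summed signed differences; squaring keeps both mistakes visible:

```java
public class Main {
    // Mean Squared Error: the average of squared differences
    static double mse(double[] actual, double[] predicted) {
        double sum = 0.0;
        for (int i = 0; i < actual.length; i++) {
            double diff = actual[i] - predicted[i];
            sum += diff * diff; // squaring discards the sign
        }
        return sum / actual.length;
    }

    public static void main(String[] args) {
        double[] actual    = {10, 10};
        double[] predicted = {12, 8}; // errors of +2 and -2
        // Signed errors would cancel to 0; squared errors give (4 + 4) / 2
        System.out.println(mse(actual, predicted)); // 4.0
    }
}
```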

Continue reading

Building Neural Networks with Weka In Java

Building neural network models and implementing the learning involve a lot of math, which can be boring. Fortunately, some tools help researchers build networks easily. Thus, a researcher who knows the basic concepts of neural networks can build a model without applying any math formula by hand.

Weka is one of the most common tools for machine learning studies. It is a Java-based API developed at the University of Waikato, New Zealand. Weka is an acronym for Waikato Environment for Knowledge Analysis. The name is also a playful pun, because the weka is a bird species endemic to New Zealand; in this way, the researchers introduced an endemic bird to the whole world.
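To show how little math is needed, here is a minimal sketch of building a neural network with Weka's `MultilayerPerceptron` class. It assumes the Weka jar is on the classpath and that a labeled dataset exists at the hypothetical path `iris.arff`; the learning rate, momentum, epoch count, and hidden-layer size are illustrative values, not recommendations.

```java
import weka.classifiers.functions.MultilayerPerceptron;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class Main {
    public static void main(String[] args) throws Exception {
        // Load a labeled dataset; "iris.arff" is a hypothetical path
        Instances data = DataSource.read("iris.arff");
        data.setClassIndex(data.numAttributes() - 1); // last attribute is the label

        MultilayerPerceptron mlp = new MultilayerPerceptron();
        mlp.setLearningRate(0.1);
        mlp.setMomentum(0.2);
        mlp.setTrainingTime(500); // number of training epochs
        mlp.setHiddenLayers("3"); // one hidden layer with 3 units

        mlp.buildClassifier(data); // training happens here, no math required
        System.out.println(mlp);   // prints the learned weights
    }
}
```

All of the backpropagation math is hidden behind `buildClassifier`; the researcher only chooses hyperparameters.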

Continue reading

Introduction to Neural Networks: A Mechanism Taking Lessons From The Past

Neural networks are inspired by the human central nervous system. They are based on making mistakes and learning lessons from past errors. Once trained, they produce results very fast, like human reflexes, even though the learning itself is a continuous, long-lasting process.

For instance, a kid who has never touched a hot surface might decide to touch one. However, once his hand has been burnt, he will not touch hot surfaces anymore. Adapting this example to the neural networks discipline makes the concept easier to grasp.
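The burnt-hand analogy can be sketched as a single artificial neuron that adjusts its weight in proportion to its error. Everything here is illustrative: one input standing for "surface is hot", a step activation, and a learning rate of 0.5 are all assumptions chosen to keep the example tiny.

```java
public class Main {
    public static void main(String[] args) {
        // Single neuron: learn to output 1 ("danger") when x = 1 ("hot surface")
        double w = 0.0, bias = 0.0, lr = 0.5;
        double x = 1.0, target = 1.0;

        for (int epoch = 0; epoch < 10; epoch++) {
            double out = (w * x + bias) >= 0.5 ? 1.0 : 0.0; // step activation
            double error = target - out;  // the "mistake"
            w    += lr * error * x;       // learn a lesson from the error
            bias += lr * error;           // once correct, error is 0 and w stops changing
        }
        System.out.println(w);
    }
}
```

After the first wrong answer the weight jumps, and from then on the neuron answers correctly without further change, much like the kid who only needs to be burnt once.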

Continue reading