Evolution of Neural Networks

Today, AI is living its golden age, and neural networks are a major contributor to it. Neural networks change our lives without us even realizing it. They lie behind image, face and speech recognition, language translation, and even future predictions. However, they did not reach their present form in a day. Let's travel to the past and trace their earlier forms.



Brief history of neural networks

1950s

The perceptron idea was produced in this decade. It involves updating weights, then deciding and reacting based on a threshold. In other words, learning was handled for the first time in this historical form of neural network. In those days, common logic functions such as AND, OR and NOT could be solved by the invention. Thus, people believed they were living in the golden age of AI. But that was not true.
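The idea can be sketched in a few lines: a weighted sum is compared to a threshold, and weights move toward the target whenever the decision is wrong. This is a minimal illustration of a 1950s-style perceptron learning the AND gate; the learning rate and epoch count are illustrative choices, not values from the post.

```python
# Minimal perceptron sketch: threshold decision + weight updates on errors.
# Learning rate and epoch count are illustrative assumptions.

def step(x):
    # The original perceptron used a hard threshold, not a smooth activation.
    return 1 if x >= 0 else 0

def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]  # weights for the two inputs
    b = 0.0         # bias, playing the role of the threshold
    for _ in range(epochs):
        for (x1, x2), target in samples:
            y = step(w[0] * x1 + w[1] * x2 + b)
            error = target - y
            # Perceptron rule: nudge weights toward the correct answer.
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

# AND is linearly separable, so the perceptron converges on it.
and_gate = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_gate)
predictions = [step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in and_gate]
print(predictions)  # → [0, 0, 0, 1]
```

Replacing `and_gate` with the OR or NOT truth table works just as well; the trouble described below starts with XOR.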

1970s

In this decade, it was revealed that the perceptron cannot learn the XOR function. People had long suspected that perceptrons could not learn XOR, but there was no proof of it. In other words, the question went unresolved for some 20 years.

This is very similar to Fermat's Last Theorem. It states that no three positive integers a, b and c satisfy the equation a^n + b^n = c^n for any integer n > 2. People could not find any counterexample, but it remained a statement, unproven for many years; there might still have been a very large solution out there. Fermat's last statement waited about 350 years to be proven and transformed into a theorem.

In this era, it was proven that solving this logic function with a perceptron is impossible. Thus, people realized they were still living in the dark age of AI. Earlier applications such as the AND and OR logic functions are linear problems, whereas XOR is a non-linear problem. In other words, its function results cannot be separated by a single straight line in a two-dimensional graph.

XOR in real life

BTW, the XOR logic function is the hello world application of machine learning studies. It is not just a theoretical concept; we often apply it in daily life, too. Consider your reaction to receiving a salary after a month's work.

You would not object if you had not worked and had not received a salary. Similarly, you would not object if you had worked and had received your salary. These correspond to the false results in the XOR truth table.

On the other hand, you should object if you had worked but had not received a salary. Vice versa, you should ethically object if you had not worked but had received a salary. These correspond to the true results in the XOR truth table.
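The salary analogy above can be written out as the XOR truth table directly; the function name here is illustrative, not from the post.

```python
# The salary analogy as code: you object exactly when work and pay disagree.
def should_object(worked, paid):
    return worked != paid  # inequality of two booleans is XOR

truth_table = [(int(w), int(p), int(should_object(w, p)))
               for w in (False, True) for p in (False, True)]
print(truth_table)  # → [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
```

The two true rows (1, 0) and (0, 1) sit on opposite corners of the unit square, which is why no single straight line can separate them from the false rows.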

XOR logic gate

1980s

In this decade, it was understood that adding layers between the inputs and outputs of a perceptron can handle non-linear problems. These layers are called hidden layers, and this type of perceptron is called a multilayer perceptron. Moreover, using differentiable non-linear activation functions such as sigmoid instead of the threshold function enables learning, because the learning algorithm consumes the functions' derivatives.
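A minimal sketch of this idea: a multilayer perceptron with 2 inputs, 2 hidden sigmoid units and 1 output, trained on XOR with plain gradient descent. The network size, learning rate, random seed and epoch count are illustrative assumptions, not values from the post.

```python
# Sketch of a multilayer perceptron learning XOR via backpropagation.
# Hyperparameters and initialization are illustrative assumptions.
import math
import random

random.seed(7)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Randomly initialized weights and biases for hidden and output layers.
w_h = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b_h = [random.uniform(-1, 1) for _ in range(2)]
w_o = [random.uniform(-1, 1) for _ in range(2)]
b_o = random.uniform(-1, 1)

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
lr = 0.5

def forward(x):
    h = [sigmoid(w_h[j][0] * x[0] + w_h[j][1] * x[1] + b_h[j]) for j in range(2)]
    y = sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + b_o)
    return h, y

def total_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

err_before = total_error()
for _ in range(20000):
    for x, target in data:
        h, y = forward(x)
        # The sigmoid derivative y * (1 - y) is what makes this possible;
        # a hard threshold has no usable derivative to propagate.
        delta_o = (y - target) * y * (1 - y)
        for j in range(2):
            delta_h = delta_o * w_o[j] * h[j] * (1 - h[j])
            w_o[j] -= lr * delta_o * h[j]
            for i in range(2):
                w_h[j][i] -= lr * delta_h * x[i]
            b_h[j] -= lr * delta_h
        b_o -= lr * delta_o
err_after = total_error()

# Typically converges to [0, 1, 1, 0], though this depends on the random start.
print([round(forward(x)[1]) for x, _ in data], err_before, err_after)
```

Note how the single perceptron of the 1950s appears twice here, stacked: the hidden layer carves the plane with two lines, and the output layer combines them, which is exactly what one line alone could not do.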





In this way, the real golden age started. The creator of this idea is Geoffrey Hinton. We will come across him again later! Keep his name in mind.

Additionally, in theory, any function can be learned if the approach can solve the XOR logic function. Even though XOR was a challenge in the 70s, it is still the letter A of the machine learning alphabet.

1990s

Neural networks were ahead of their time! Even though humanity had the cure, it did not have enough computation power in those days. But humans are vanity-driven beings: they blamed the neural networks instead of the computation power they had. This caused neural networks to hibernate. Researchers were channeled into inventing a new algorithm, the support vector machine (SVM).

2000s

In this decade, Geoffrey Hinton got on the stage again! He transformed the concept of neural networks into deep learning, which includes many hidden layers. The rise of computation power enabled this approach to be adopted in academia and industry. But it still did not become widespread in personal usage.

2010s

In this decade, graphics processing units (GPUs) became widespread in personal use. Processing big data in parallel became possible. Moreover, cloud services made it possible to consume these resources without investing in any hardware. That was the breaking point. It revealed that deep learning defeats the older learning algorithms.

Today, deep learning stands behind most challenging technologies such as speech recognition, image recognition and language translation. Just think about the developments in language translation in recent years.

So, even though neural networks passed through a period of stagnation, in the end they became the winner among machine learning algorithms. And history writes Geoffrey Hinton's name in letters of gold. He changed the destiny of science twice. Recently, he announced capsule networks and triggered the third leap in the AI world. We are grateful to him.

Geoffrey Hinton, Godfather of Neural Networks and Deep Learning

