Sefik Ilkin Serengil

Code wins arguments


Author: Sefik Serengil

AI: a one-day wonder or an everlasting challenge

Debates between humans and computers start with the Mechanical Turk, a historical autonomous chess player constructed in the 18th century. However, … More

chess, game of go, poker

Adaptive Learning in Neural Networks

Gradient descent is one of the most powerful optimization methods. However, learning time is a challenge, too. The standard version of gradient … More

adaptive learning rate, back propagation, gradient descent, neural networks
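
A minimal sketch of the idea behind an adaptive learning rate, assuming a toy objective f(x) = x² and a simple 1/t decay schedule (both illustrative choices, not code from the post):

```java
// Gradient descent with a learning rate that decays over epochs.
// The objective f(x) = x^2 and the decay schedule are illustrative.
public class AdaptiveLearningDemo {
    public static void main(String[] args) {
        double x = 5.0;                  // initial weight
        double baseLearningRate = 0.1;
        for (int epoch = 1; epoch <= 50; epoch++) {
            double gradient = 2 * x;     // derivative of f(x) = x^2
            // shrink the step size as training progresses
            double learningRate = baseLearningRate / (1 + 0.01 * epoch);
            x -= learningRate * gradient;
        }
        System.out.println("minimum found near x = " + x);
    }
}
```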

Incorporating Momentum Into Neural Networks Learning

Newton’s cradle is the most popular example of momentum conservation. A lifted and released sphere strikes the stationary spheres and … More

gradient descent, momentum, neural networks
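
A minimal sketch of the momentum update rule the post builds on, again assuming a toy objective f(x) = x²; the coefficient values are illustrative:

```java
// Momentum blends the previous weight change into the current one,
// which damps oscillations and speeds up convergence.
public class MomentumDemo {
    public static void main(String[] args) {
        double x = 5.0;
        double velocity = 0.0;
        double learningRate = 0.1;
        double momentum = 0.9;           // fraction of the previous update kept
        for (int epoch = 0; epoch < 100; epoch++) {
            double gradient = 2 * x;     // derivative of f(x) = x^2
            velocity = momentum * velocity - learningRate * gradient;
            x += velocity;
        }
        System.out.println("converged near x = " + x);
    }
}
```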

Even Superheroes Need to Rest: Working on Trained Neural Networks in Weka

Applying neural networks can be divided into two phases: learning and forecasting. The learning phase has a high cost, whereas forecasting … More

Java, multilayer perceptron, weka
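
A sketch of that two-phase split in Weka: train and serialize the network once, then reload it for cheap forecasting. The file names train.arff and mlp.model are placeholders:

```java
import weka.classifiers.functions.MultilayerPerceptron;
import weka.core.Instances;
import weka.core.SerializationHelper;
import weka.core.converters.ConverterUtils.DataSource;

public class TrainedModelDemo {
    public static void main(String[] args) throws Exception {
        Instances train = new DataSource("train.arff").getDataSet();
        train.setClassIndex(train.numAttributes() - 1);

        MultilayerPerceptron mlp = new MultilayerPerceptron();
        mlp.buildClassifier(train);                   // costly learning phase
        SerializationHelper.write("mlp.model", mlp);  // persist the network

        // Later (or in another program): forecasting phase only.
        MultilayerPerceptron restored =
            (MultilayerPerceptron) SerializationHelper.read("mlp.model");
        double prediction = restored.classifyInstance(train.firstInstance());
        System.out.println("prediction: " + prediction);
    }
}
```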

Building Neural Networks with Weka In Java

Building neural network models and implementing learning involves lots of math, which might be boring. Herein, some tools help researchers to … More

classification, Java, neural networks, regression, weka
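
A sketch of configuring and training a MultilayerPerceptron in Weka; the dataset name and the parameter values here are illustrative assumptions, not the post's exact settings:

```java
import weka.classifiers.Evaluation;
import weka.classifiers.functions.MultilayerPerceptron;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

import java.util.Random;

public class BuildMlpDemo {
    public static void main(String[] args) throws Exception {
        Instances data = new DataSource("dataset.arff").getDataSet();
        data.setClassIndex(data.numAttributes() - 1);

        MultilayerPerceptron mlp = new MultilayerPerceptron();
        mlp.setLearningRate(0.1);
        mlp.setMomentum(0.2);
        mlp.setTrainingTime(500);    // number of epochs
        mlp.setHiddenLayers("3");    // one hidden layer with 3 nodes
        mlp.buildClassifier(data);

        // 10-fold cross-validation to judge the model
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(mlp, data, 10, new Random(1));
        System.out.println(eval.toSummaryString());
    }
}
```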

Hyperbolic Tangent as Neural Network Activation Function

In neural networks, as an alternative to the sigmoid function, the hyperbolic tangent function can be used as the activation function. When you … More

activation function, neural networks
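
For reference, tanh squashes its input into (-1, 1) and has the convenient derivative 1 - tanh²(x); a small sketch printing both:

```java
// Hyperbolic tangent as an activation function and its derivative,
// which backpropagation reuses without recomputing exponentials.
public class TanhDemo {
    static double tanhDerivative(double x) {
        double t = Math.tanh(x);
        return 1 - t * t;
    }

    public static void main(String[] args) {
        for (double x = -2; x <= 2; x += 1) {
            System.out.printf("x=%+.1f  tanh=%+.4f  tanh'=%.4f%n",
                    x, Math.tanh(x), tanhDerivative(x));
        }
    }
}
```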

Backpropagation Implementation: Neural Networks Learning From Theory To Action

We’ve focused on the math behind neural network learning and the proof of the backpropagation algorithm. Let’s face it, the mathematical background … More

backpropagation, Java, neural networks, sinus wave
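
A compact backpropagation sketch in that spirit: a one-input, one-hidden-layer network fit to a sinus wave. The layer size, learning rate, and the scaling of sin(x) into (0, 1) are illustrative assumptions, not the post's code:

```java
import java.util.Random;

public class BackpropSinusDemo {
    static final int H = 8;                      // hidden units (assumption)
    static double[] w1 = new double[H], b1 = new double[H], w2 = new double[H];
    static double b2;
    static double[] hidden = new double[H];      // cached hidden activations

    static double sigmoid(double x) { return 1 / (1 + Math.exp(-x)); }

    static double forward(double x) {
        double sum = b2;
        for (int i = 0; i < H; i++) {
            hidden[i] = sigmoid(w1[i] * x + b1[i]);
            sum += w2[i] * hidden[i];
        }
        return sigmoid(sum);
    }

    public static void main(String[] args) {
        Random rnd = new Random(1);
        for (int i = 0; i < H; i++) { w1[i] = rnd.nextGaussian(); w2[i] = rnd.nextGaussian(); }
        double lr = 0.1;
        for (int step = 0; step < 50000; step++) {
            double x = rnd.nextDouble() * 2 * Math.PI;
            double target = (Math.sin(x) + 1) / 2;          // squash sin into (0,1)
            double out = forward(x);
            double deltaOut = (out - target) * out * (1 - out);   // sigmoid'
            for (int i = 0; i < H; i++) {
                double deltaHidden = deltaOut * w2[i] * hidden[i] * (1 - hidden[i]);
                w2[i] -= lr * deltaOut * hidden[i];
                w1[i] -= lr * deltaHidden * x;
                b1[i] -= lr * deltaHidden;
            }
            b2 -= lr * deltaOut;
        }
        // map the (0,1) output back to (-1,1); sin(pi/2) should be near 1
        System.out.println("network says sin(pi/2) ≈ " + (2 * forward(Math.PI / 2) - 1));
    }
}
```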

The Math Behind Neural Networks Learning with Backpropagation

Neural networks are one of the most powerful machine learning algorithms. However, their background might confuse brains because of complex … More

backpropagation, neural networks, sigmoid
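
The core of that math is the chain rule. A worked instance for an output weight, assuming squared error and sigmoid activation as in the post's setting:

```latex
% E = \tfrac{1}{2}(y - \hat{y})^2, \quad \hat{y} = \sigma(net), \quad net = \sum_i w_i x_i
\frac{\partial E}{\partial w_i}
  = \frac{\partial E}{\partial \hat{y}}
    \cdot \frac{\partial \hat{y}}{\partial net}
    \cdot \frac{\partial net}{\partial w_i}
  = (\hat{y} - y)\,\hat{y}\,(1 - \hat{y})\,x_i,
\qquad
w_i \leftarrow w_i - \alpha \,\frac{\partial E}{\partial w_i}
```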

Sigmoid Function as Neural Network Activation Function

The sigmoid function (aka logistic function) is mostly picked as the activation function in neural networks because its derivative is easy to demonstrate. … More

activation function, neural networks
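
The derivative in question has a neat closed form that reuses the function's own value:

```latex
\sigma(x) = \frac{1}{1 + e^{-x}},
\qquad
\sigma'(x) = \frac{e^{-x}}{(1 + e^{-x})^2} = \sigma(x)\,\bigl(1 - \sigma(x)\bigr)
```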

Introduction to Neural Networks: A Mechanism Taking Lessons From The Past

Neural networks are inspired by the human central nervous system. They are based on making mistakes and learning lessons from past errors. They … More

classification, lion king, neural networks, regression



Licensed under a Creative Commons Attribution 4.0 International License.


You can use any content of this blog provided that you cite or reference it.
