

Sefik Ilkin Serengil

Code wins arguments


Small is Beautiful: working with short URLs

Today’s world limits the length of our expressions to 140 characters. Whether you send an SMS or a tweet, you have to fit your…

bit.ly, bitlinks, bitly, blowfish, collision, goo.gl, google, hash, http get, http post, Java, url shortening

AI: a one-day wonder or an everlasting challenge

Debates between humans and computers start with the Mechanical Turk, a historical autonomous chess player constructed in the 18th century. However,…

chess, game go, poker

Adaptive Learning in Neural Networks

Gradient descent is one of the most powerful optimization methods. However, learning time is a challenge, too. The standard version of gradient…

adaptive learning rate, back propagation, gradient descent, neural networks
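The idea behind an adaptive learning rate can be sketched as follows. This is a minimal illustration of one common scheme (a "bold driver"-style rule), not necessarily the exact method the post uses; all names and constants are illustrative.

```python
# Sketch of an adaptive learning rate ("bold driver"-style rule):
# grow the rate while the error keeps falling, shrink it sharply otherwise.
# Function name and constants are illustrative, not from the post.

def adapt_learning_rate(rate, prev_error, curr_error,
                        grow=1.05, shrink=0.5):
    """Return an updated learning rate based on the error trend."""
    if curr_error < prev_error:
        return rate * grow    # making progress: accelerate slightly
    return rate * shrink      # overshot: back off aggressively

rate = 0.1
rate = adapt_learning_rate(rate, prev_error=1.0, curr_error=0.8)  # error fell
```

The asymmetry (gentle growth, aggressive shrinking) is deliberate: overshooting a minimum is more costly than accelerating slowly.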

Incorporating Momentum Into Neural Networks Learning

Newton’s cradle is the most popular example of momentum conservation. A lifted and released sphere strikes the stationary spheres and…

gradient descent, momentum, neural networks

Even Superheroes Need to Rest: Working on Trained Neural Networks in Weka

Applying neural networks can be divided into two phases: learning and forecasting. The learning phase has a high cost, whereas forecasting…

Java, multilayer perceptron, weka

Building Neural Networks with Weka In Java

Building neural network models and implementing learning involve a lot of math, and this might be boring. Herein, some tools help researchers to…

classification, Java, neural networks, regression, weka

Hyperbolic Tangent as Neural Network Activation Function

In neural networks, the hyperbolic tangent function can be used as an activation function as an alternative to the sigmoid function. When you…

activation function, neural networks
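The tanh activation and its derivative can be shown in a short sketch (an illustration of the standard identity, not code from the post): outputs lie in (-1, 1), and the derivative has the closed form 1 − tanh(x)².

```python
import math

# tanh as an activation: output lies in (-1, 1), and its derivative has
# the closed form 1 - tanh(x)**2, which is convenient during backpropagation.

def tanh_derivative(x):
    t = math.tanh(x)
    return 1.0 - t * t

print(math.tanh(0.0))        # 0.0 (zero-centered, unlike sigmoid)
print(tanh_derivative(0.0))  # 1.0 (steepest slope at the origin)
```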

Backpropagation Implementation: Neural Networks Learning From Theory To Action

We’ve focused on the math behind neural network learning and the proof of the backpropagation algorithm. Let’s face it, mathematical background…

backpropagation, Java, neural networks, sinus wave

The Math Behind Neural Networks Learning with Backpropagation

Neural networks are one of the most powerful machine learning algorithms. However, their background might be confusing because of complex…

backpropagation, neural networks, sigmoid

Sigmoid Function as Neural Network Activation Function

The sigmoid function (aka the logistic function) is mostly picked as the activation function in neural networks because its derivative is easy to demonstrate.…

activation function, neural networks
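The derivative property the excerpt refers to is the standard identity σ′(x) = σ(x)·(1 − σ(x)); a minimal sketch (illustrative, not code from the post):

```python
import math

# The sigmoid (logistic) function and the derivative identity the teaser
# refers to: sigma'(x) = sigma(x) * (1 - sigma(x)).

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # no extra exp() call needed once sigmoid is known

print(sigmoid(0.0))             # 0.5
print(sigmoid_derivative(0.0))  # 0.25
```

Because the derivative reuses the forward-pass output, backpropagation needs no additional exponential evaluations.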



Licensed under a Creative Commons Attribution 4.0 International License.


You can use any content of this blog provided that you cite or reference it.
