Today I learned again that most things in life are a matter of semantics… After some online lectures in Machine Learning techniques I discovered that what I call “Ordinary Least Squares” is generalized as a “cost function”, and that a simplified version of Newton’s method is referred to as “Gradient Descent”.
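
To spell out the connection for myself (the notation here is my own, not taken from the lectures): the least-squares objective becomes a cost function J over the parameters θ, and the two update rules differ only in how the step is scaled.

```latex
\[
J(\theta) \;=\; \frac{1}{2m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr)^2
\]
\[
\text{Newton's method: } \theta \leftarrow \theta - H^{-1}\,\nabla J(\theta)
\qquad
\text{gradient descent: } \theta \leftarrow \theta - \alpha\,\nabla J(\theta)
\]
```

Seen that way, gradient descent really is a simplified Newton’s method: the inverse Hessian H⁻¹ gets replaced by a fixed step size α.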

So, basically, the core of a supervised learning algorithm seems to come down to choosing an appropriate “cost function” and then applying the most effective minimization algorithm to it.
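
To make that concrete for myself, here is a minimal sketch in Python (the toy data, learning rate, and iteration count are all invented for illustration): a least-squares cost for a one-variable linear model, minimized with plain gradient descent.

```python
import numpy as np

# Toy data: y = 2x + 1 plus a bit of noise (hypothetical values, just for the demo).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=50)

def cost(theta0, theta1):
    """Least-squares cost: half the mean squared residual."""
    residuals = theta0 + theta1 * x - y
    return 0.5 * np.mean(residuals ** 2)

def gradient(theta0, theta1):
    """Partial derivatives of the cost w.r.t. each parameter."""
    residuals = theta0 + theta1 * x - y
    return np.mean(residuals), np.mean(residuals * x)

# Gradient descent: repeatedly step against the gradient, scaled by alpha.
theta0, theta1 = 0.0, 0.0
alpha = 0.01  # fixed learning rate, standing in for Newton's inverse Hessian
for _ in range(10_000):
    g0, g1 = gradient(theta0, theta1)
    theta0 -= alpha * g0
    theta1 -= alpha * g1

print(f"intercept ~ {theta0:.3f}, slope ~ {theta1:.3f}")  # should land near 1 and 2
print(f"final cost: {cost(theta0, theta1):.4f}")
```

Swap the cost function (say, for cross-entropy) or the minimizer (say, for Newton’s method proper) and the same skeleton gives you a different learner, which seems to be the whole point.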