I read a lot of misconceptions this morning related to this article about Google Translate. It's not exactly fresh news, but this morning in my Telegram group @scienza another popularization article was posted that completely misunderstood the premises of the original academic paper (and the supposedly informed comments are not really informed), so I decided to set the record straight and offer a question. In the article the approach is referred to as a multitask learning framework.
Today I learned again that most things in life are a matter of semantics… After some online lectures on machine learning techniques, I discovered that what I call “Ordinary Least Squares” is generalized as a “cost function”, and a simplified version of Newton's method is referred to as “Gradient Descent”. So, basically, the core of a supervised learning algorithm seems to be the choice of an appropriate “cost function” and the application of the most effective minimization algorithm.
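To make that correspondence concrete, here is a minimal sketch of exactly this pairing: the ordinary-least-squares objective treated as a “cost function” and minimized by plain gradient descent on a one-variable linear model. The data, learning rate, and step count are assumptions for illustration, not anything from the lectures themselves.

```python
def cost(theta0, theta1, xs, ys):
    """Mean squared error: the OLS objective viewed as a cost function."""
    n = len(xs)
    return sum((theta0 + theta1 * x - y) ** 2 for x, y in zip(xs, ys)) / (2 * n)

def gradient_descent(xs, ys, lr=0.1, steps=1000):
    """Minimize the cost by repeatedly stepping against its gradient."""
    n = len(xs)
    theta0, theta1 = 0.0, 0.0
    for _ in range(steps):
        # Residuals of the current linear fit
        err = [theta0 + theta1 * x - y for x, y in zip(xs, ys)]
        # Partial derivatives of the cost w.r.t. each parameter
        g0 = sum(err) / n
        g1 = sum(e * x for e, x in zip(err, xs)) / n
        theta0 -= lr * g0
        theta1 -= lr * g1
    return theta0, theta1

# Toy data generated exactly from y = 2x + 1
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]
t0, t1 = gradient_descent(xs, ys)
```

On this toy data the parameters converge close to the generating values (intercept ≈ 1, slope ≈ 2), which is the whole point: choose a cost, then let a generic minimizer do the fitting.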