The Hundred-Page Machine Learning Book – Andriy Burkov

Let’s start by telling the truth: machines don’t learn. What a typical “learning machine” does is find a mathematical formula which, when applied to a collection of inputs (called “training data”), produces the desired outputs. This mathematical formula also generates the correct outputs for most other inputs (distinct from the training data), on the condition that those inputs come from the same or a similar statistical distribution as the one the training data was drawn from.

Why isn’t that learning? Because if you slightly distort the inputs, the output is very likely to become completely wrong. That is not how learning in animals works. If you learned to play a video game by looking straight at the screen, you would still be a good player if someone rotated the screen slightly. A machine learning algorithm that was trained by “looking” straight at the screen will fail to play the game on a rotated screen, unless it was also trained to recognize rotation.

So why the name “machine learning” then? The reason, as is often the case, is marketing: Arthur Samuel, an American pioneer in the fields of computer gaming and artificial intelligence, coined the term in 1959 while at IBM. Just as IBM tried in the 2010s to market the term “cognitive computing” to stand out from the competition, in the 1960s IBM used the cool new term “machine learning” to attract both clients and talented employees.

As you can see, just as artificial intelligence is not intelligence, machine learning is not learning. However, machine learning is a universally recognized term that usually refers to the science and engineering of building machines capable of doing various useful things without being explicitly programmed to do so. The word “learning” in the term is used by analogy with learning in animals rather than literally.
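The rotated-screen point can be made concrete. The sketch below (a toy illustration, not anything from the book: a hand-rolled nearest-centroid classifier on two synthetic 2D clusters) trains on unrotated inputs and then evaluates on inputs from the same distribution versus inputs rotated by 90 degrees; accuracy collapses on the rotated data even though “the game” is unchanged.

```python
import math
import random

random.seed(0)

def make_data(n, rotate=0.0):
    # Two noisy clusters centered at (2, 0) and (-2, 0).
    # `rotate` (radians) distorts the inputs, simulating a distribution shift.
    data = []
    for _ in range(n):
        label = random.choice([0, 1])
        cx = 2.0 if label == 0 else -2.0
        x, y = random.gauss(cx, 0.5), random.gauss(0.0, 0.5)
        xr = x * math.cos(rotate) - y * math.sin(rotate)
        yr = x * math.sin(rotate) + y * math.cos(rotate)
        data.append(((xr, yr), label))
    return data

def train_centroids(data):
    # "Learning" here is just computing one mean point per class.
    sums = {0: [0.0, 0.0, 0], 1: [0.0, 0.0, 0]}
    for (x, y), label in data:
        sums[label][0] += x
        sums[label][1] += y
        sums[label][2] += 1
    return {c: (sx / n, sy / n) for c, (sx, sy, n) in sums.items()}

def accuracy(centroids, data):
    # Predict the class whose centroid is closest to each point.
    correct = 0
    for (x, y), label in data:
        pred = min(centroids,
                   key=lambda c: (x - centroids[c][0]) ** 2
                               + (y - centroids[c][1]) ** 2)
        correct += pred == label
    return correct / len(data)

centroids = train_centroids(make_data(500))
same_dist = accuracy(centroids, make_data(500))                    # training distribution
rotated = accuracy(centroids, make_data(500, rotate=math.pi / 2))  # inputs rotated 90 degrees

print(f"same distribution: {same_dist:.2f}, rotated: {rotated:.2f}")
```

On the training distribution the classifier is near-perfect; on the rotated inputs, which land equidistant from both learned centroids, it does little better than guessing. The “formula” was never wrong about its training distribution, it simply never generalized beyond it.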

Related posts:

Medical Image Segmentation Using Artificial Neural Networks
Deep Learning for Natural Language Processing - Jason Brownlee
Hands-on Machine Learning with Scikit-Learn, Keras & TensorFlow - Aurelien Geron
Introducing Data Science - Davy Cielen & Arno D.B. Meysman & Mohamed Ali
Machine Learning with Python for Everyone - Mark E. Fenner
Learning scikit-learn Machine Learning in Python - Raul Garreta & Guillermo Moncecchi
Python Machine Learning Equation Reference - Sebastian Raschka
Introduction to Deep Learning - Eugene Charniak
Neural Networks and Deep Learning - Charu C. Aggarwal
Deep Learning and Neural Networks - Jeff Heaton
Deep Learning with Python - A Hands-on Introduction - Nikhil Ketkar
Deep Learning with PyTorch - Vishnu Subramanian
Pattern Recognition and Machine Learning - Christopher M. Bishop
Python Machine Learning - Sebastian Raschka
Machine Learning Mastery with Python - Understand your data, create accurate models and work project...
Deep Learning for Natural Language Processing - Palash Goyal & Sumit Pandey & Karan Jain
Data Science and Big Data Analytics - EMC Education Services
Artificial Intelligence with an Introduction to Machine Learning, second edition - Richard E. Neapolit...
Python 3 for Absolute Beginners - Tim Hall & J.P. Stacey
Statistical Methods for Machine Learning - Discover how to Transform Data into Knowledge with Pytho...
Deep Learning with Theano - Christopher Bourez
Deep Learning in Python - LazyProgrammer
Python for Programmers with introductory AI case studies - Paul Deitel & Harvey Deitel
Superintelligence - Paths, Dangers, Strategies - Nick Bostrom
Generative Deep Learning - Teaching Machines to Paint, Write, Compose and Play - David Foster
Python Data Analytics with Pandas, NumPy and Matplotlib - Fabio Nelli
Intelligent Projects Using Python - Santanu Pattanayak
Learn Keras for Deep Neural Networks - Jojo Moolayil
Python Deeper Insights into Machine Learning - Sebastian Raschka & David Julian & John Hearty
Applied Text Analysis with Python - Benjamin Bengfort & Rebecca Bilbro & Tony Ojeda
Python Machine Learning Cookbook - Practical solutions from preprocessing to Deep Learning - Chris A...
Pro Deep Learning with TensorFlow - Santanu Pattanayak