I am interested in machine learning and deep learning, and this book gave me an overview of the field.
Machine learning can be categorized into five groups: symbolists, connectionists, evolutionaries, Bayesians, and analogizers.
All of them look for ways to simulate intelligence in machines through learning.
Symbolists are the formal-logic type, treating intelligence as symbol manipulation. Their weapon is inverse deduction (induction): working backwards from facts to the rules that explain them. Decision trees and rule-learning algorithms are under this category.
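To make the symbolist flavour concrete, here is a minimal sketch of inducing a single IF-THEN rule (a decision stump) from labelled examples. The dataset and the exhaustive search over thresholds are my own illustration, not anything from the book.

```python
# Symbolist-style learning sketch: induce one IF-THEN rule (a decision
# stump) that best separates the labelled examples. Toy data below is
# invented for illustration.

def learn_stump(examples):
    """Find the (feature, threshold) rule that misclassifies the fewest examples.

    examples: list of ((f0, f1, ...), label) pairs with labels 0/1.
    """
    best = None
    n_features = len(examples[0][0])
    for f in range(n_features):
        for (x, _) in examples:
            t = x[f]
            # Candidate rule: predict 1 if feature f >= t, else 0
            errors = sum(1 for (xi, yi) in examples
                         if (1 if xi[f] >= t else 0) != yi)
            if best is None or errors < best[0]:
                best = (errors, f, t)
    _, f, t = best
    return lambda x: 1 if x[f] >= t else 0

# Toy data: label is 1 whenever the second feature is at least 5
data = [((1, 2), 0), ((3, 7), 1), ((2, 5), 1), ((4, 1), 0)]
rule = learn_stump(data)
```

The learned rule is human-readable ("IF feature 1 >= 5 THEN class 1"), which is the symbolists' selling point.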
Connectionists get their inspiration from the brain, namely neurons. Their weapon is the neural network. Deep learning is under this category.
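As a minimal connectionist sketch, here is a single artificial neuron (a perceptron) learning the logical AND function by nudging its weights toward correct answers. The learning rate and epoch count are arbitrary illustrative choices.

```python
# Connectionist sketch: one artificial neuron (perceptron) learning
# logical AND. Learning rate and epochs are illustrative values.

def train_perceptron(samples, epochs=20, lr=0.1):
    """samples: list of ((x1, x2), target) pairs with targets 0/1."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x, target) in samples:
            # The neuron fires if the weighted sum crosses the threshold
            out = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - out
            # Nudge weights in the direction that reduces the error
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return lambda x: 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
neuron = train_perceptron(and_data)
```

Deep learning stacks many such units into layers and trains them with backpropagation instead of this simple update rule.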
Evolutionaries try to emulate evolution: since Mother Nature managed to produce intelligence, they simulate the way she did it. Genetic algorithms are under this category.
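A minimal genetic-algorithm sketch, on the classic OneMax toy problem (evolve a bit string toward all ones). The population size, mutation rate, and generation limit are all arbitrary illustrative choices.

```python
import random

# Evolutionary sketch: a genetic algorithm on OneMax (maximize the
# number of ones in a bit string). All parameters are illustrative.

random.seed(0)
LENGTH = 20

def fitness(bits):
    return sum(bits)  # more ones = fitter

def mutate(bits, rate=0.05):
    # Flip each bit independently with a small probability
    return [b ^ 1 if random.random() < rate else b for b in bits]

def crossover(a, b):
    # Single-point crossover: splice a prefix of one parent onto the other
    cut = random.randrange(1, LENGTH)
    return a[:cut] + b[cut:]

pop = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(30)]
for generation in range(100):
    pop.sort(key=fitness, reverse=True)
    if fitness(pop[0]) == LENGTH:
        break
    survivors = pop[:10]  # selection: keep the fittest third
    pop = survivors + [mutate(crossover(random.choice(survivors),
                                        random.choice(survivors)))
                       for _ in range(20)]

best = max(pop, key=fitness)
```

Selection, crossover, and mutation are the three moves nature uses; everything else here is bookkeeping.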
Bayesians use probability. You attach probabilities to events and update them as evidence accumulates: the more often an event occurs, the more probable it becomes. Bayesian networks and Markov networks are under this category.
Analogizers use similarity functions: things that are similar in some respects tend to be alike in others. The famous nearest-neighbour algorithm and the Support Vector Machine (SVM) are under this category.
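A minimal k-nearest-neighbours sketch: a new point is labelled by the majority vote of the k most similar stored examples. The toy points and the choice of Euclidean distance are illustrative.

```python
from collections import Counter

# Analogizer sketch: k-nearest neighbours with Euclidean distance as
# the similarity function. Toy 2-D points are invented for illustration.

def knn_predict(train, point, k=3):
    """train: list of ((x, y), label) pairs."""
    dist = lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    # The k stored examples most similar to the query point...
    nearest = sorted(train, key=lambda ex: dist(ex[0], point))[:k]
    # ...vote on its label
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [((1, 1), "red"), ((1, 2), "red"), ((2, 1), "red"),
         ((8, 8), "blue"), ((8, 9), "blue"), ((9, 8), "blue")]
```

SVMs refine the same idea by keeping only the most informative examples (the support vectors) and allowing richer similarity functions (kernels).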
There are also ways to combine multiple learners, namely stacking, bagging (e.g. random forests), and boosting.
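As a sketch of one of these combiners, here is bagging in miniature: train several copies of a weak learner on bootstrap resamples of the data, then combine them by majority vote. The trivial threshold learner and the toy data are my own illustration.

```python
import random
from collections import Counter

# Bagging sketch: several weak learners trained on bootstrap resamples,
# combined by majority vote. Base learner and data are illustrative.

random.seed(1)

def train_threshold(sample):
    """Weak learner: predict 1 if x exceeds the mean of the sample's x values."""
    xs = [x for x, _ in sample]
    cut = sum(xs) / len(xs)
    return lambda x: 1 if x > cut else 0

def bagging(data, n_models=7):
    models = []
    for _ in range(n_models):
        # Bootstrap: resample the data with replacement
        sample = [random.choice(data) for _ in data]
        models.append(train_threshold(sample))
    def vote(x):
        # Majority vote across the ensemble
        counts = Counter(m(x) for m in models)
        return counts.most_common(1)[0][0]
    return vote

data = [(1, 0), (2, 0), (3, 0), (7, 1), (8, 1), (9, 1)]
ensemble = bagging(data)
```

A random forest is this same recipe with decision trees as the base learner; boosting instead trains the learners sequentially, each one focusing on the previous one's mistakes.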
Finally, the book argues for a "master algorithm" that would unify all five categories.