LIONbook Chapter 11: Democracy in machine learning – combining models
Here is the latest chapter from the LIONbook, a new book dedicated to "LION", the combination of Machine Learning and Intelligent Optimization, written by Roberto Battiti and Mauro Brunato, developers of the LionSolver software. The book is provided free for personal and non-profit use. Chapter 11 looks at democracy in machine learning: how to combine different models in flexible, creative, and effective ways.
This book is freely available on the web.
Here are the previous chapters:
- Chapters 1-2: Introduction and nearest neighbors
- Chapter 3: Learning requires a method
- Chapter 4: Linear models
- Chapter 5: Mastering generalized linear least-squares
- Chapter 6: Rules, decision trees, and forests
- Chapter 7: Ranking and selecting features
- Chapter 8: Specific nonlinear models
- Chapter 9: Neural networks, shallow and deep
- Chapter 10: Statistical Learning Theory and Support Vector Machines (SVM)
You can also download the entire book here.
The latest chapter is Chapter 11: Democracy in machine learning.
This is the final chapter in the part on supervised learning. As you have discovered, there are many competing techniques for solving a supervised learning problem, and each technique is characterized by design choices and meta-parameters: when this flexibility is taken into account, one easily ends up with a very large number of possible models for a given task.
When confronted with this abundance, one may simply select the best model (with the best meta-parameters) and throw everything else away, or recognize that there is never too much of a good thing and try to use all of them, or at least the best ones. One already spends effort and CPU time selecting the best model and meta-parameters, producing many models as a byproduct. Are there sensible ways to recycle them so that the effort is not wasted? Relax: this chapter does not introduce radically new models, but deals with using many different models in flexible, creative, and effective ways. The advantage is in some cases so clear that using many models can make the difference between winning and losing a machine learning competition.
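To make the idea concrete, here is a minimal sketch of one of the simplest ways to combine classifiers: a plain majority vote over their predictions. The function name and the toy threshold models below are illustrative assumptions, not taken from the book.

```python
from collections import Counter

def majority_vote(models, x):
    """Combine classifiers by simple majority vote.

    `models` is any list of callables mapping an input to a class
    label; the winning label is the one predicted most often.
    (Illustrative sketch, not the book's implementation.)
    """
    votes = [m(x) for m in models]
    return Counter(votes).most_common(1)[0][0]

# Three toy "models": threshold classifiers that disagree near the boundary.
models = [
    lambda x: 1 if x > 0.3 else 0,
    lambda x: 1 if x > 0.5 else 0,
    lambda x: 1 if x > 0.7 else 0,
]

print(majority_vote(models, 0.6))  # two of the three models vote 1, so the ensemble outputs 1
print(majority_vote(models, 0.2))  # all three vote 0, so the ensemble outputs 0
```

Even this crude scheme shows the appeal: individual models that err on different inputs can, when combined, be more accurate than any single one of them.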