LIONbook Chapter 8: Specific nonlinear models
The LIONbook on machine learning and optimization, written by the co-founders of LionSolver software, is provided free on a chapter-by-chapter basis for personal and non-profit use. Chapter 8 looks at nonlinear models, from logistic regression to LASSO.
Here is the latest chapter from LIONbook, a new book dedicated to the "LION" combination of Machine Learning and Intelligent Optimization, written by the developers of LionSolver software, Roberto Battiti and Mauro Brunato.
This book will be available for free from the web, chapter after chapter.
Here are previous chapters:
- Chapters 1-2: Introduction and nearest neighbors
- Chapter 3: Learning requires a method
- Chapter 4: Linear models
- Chapter 5: Mastering generalized linear least-squares
- Chapter 6: Rules, decision trees, and forests
- Chapter 7: Ranking and selecting features
The latest chapter is

Chapter 8: Specific nonlinear models

In this chapter we continue along our path from linear to nonlinear models. To avoid the vertigo caused by an abrupt introduction of the most general and powerful models, we start with gradual modifications of the linear model: first to make it suitable for predicting probabilities (logistic regression), then by making the linear model local, giving more emphasis to the closest examples in a kind of smoothed version of K-nearest neighbors (locally-weighted linear regression), and finally by selecting a subset of inputs via appropriate constraints on the weights (LASSO).
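Two of these ideas fit in a few lines of numpy. The sketch below is illustrative, not the book's own code: locally-weighted linear regression with a Gaussian kernel of width `tau` (nearby training points dominate each local fit), and LASSO solved by coordinate descent with soft thresholding (which sets small coefficients exactly to zero, performing feature selection). The function names, the synthetic data, and the parameter values are our own choices.

```python
import numpy as np

def locally_weighted_regression(x_query, X, y, tau=0.5):
    """Predict at x_query by fitting a line that emphasizes nearby points."""
    # Gaussian kernel: training points close to x_query get large weights
    w = np.exp(-((X - x_query) ** 2) / (2 * tau ** 2))
    A = np.column_stack([np.ones_like(X), X])  # design matrix with intercept
    W = np.diag(w)
    # Weighted least-squares normal equations: (A^T W A) beta = A^T W y
    beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    return beta[0] + beta[1] * x_query

def lasso_coordinate_descent(X, y, lam, n_iter=200):
    """Minimize 0.5 * ||y - X beta||^2 + lam * ||beta||_1 coordinate-wise."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        for j in range(X.shape[1]):
            # Residual with feature j's current contribution removed
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r
            # Soft thresholding: weakly correlated features are zeroed out,
            # which is how the L1 constraint selects a subset of inputs
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / (X[:, j] @ X[:, j])
    return beta

rng = np.random.default_rng(0)

# Locally-weighted regression tracks a nonlinear curve from noisy samples
X1 = np.linspace(0.0, 2.0 * np.pi, 50)
y1 = np.sin(X1) + 0.1 * rng.standard_normal(50)
pred = locally_weighted_regression(np.pi / 2, X1, y1, tau=0.5)

# LASSO keeps the one relevant input and zeroes out the four irrelevant ones
X2 = rng.standard_normal((100, 5))
y2 = 3.0 * X2[:, 0] + 0.1 * rng.standard_normal(100)
coef = lasso_coordinate_descent(X2, y2, lam=10.0)
```

The bandwidth `tau` and the penalty `lam` play analogous roles as knobs trading bias against variance: a large `tau` recovers the ordinary global linear fit, while a large `lam` drives more coefficients to zero.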
After this preparatory phase, in the next chapters we will be ready to enter the holy of holies of flexible nonlinear models for arbitrary smooth input-output relationships like Multi-Layer Perceptrons (MLP) and Support Vector Machines (SVM).