LIONbook Chapter 16: Visualizing Graphs and Networks
Here is the latest chapter of the LIONbook, a book dedicated to the "LION" combination of Machine Learning and Intelligent Optimization, written by Roberto Battiti and Mauro Brunato, the developers of LionSolver software. The book is freely available on the web for personal and nonprofit use. Chapter 16 looks at visualizing graphs and networks by nonlinear maps.
Here are the previous chapters:
 Chapters 1-2: Introduction and nearest neighbors.
 Chapter 3: Learning requires a method
 Chapter 4: Linear models
 Chapter 5: Mastering generalized linear least-squares
 Chapter 6: Rules, decision trees, and forests
 Chapter 7: Ranking and selecting features
 Chapter 8: Specific nonlinear models
 Chapter 9: Neural networks, shallow and deep
 Chapter 10: Statistical Learning Theory and Support Vector Machines (SVM).
 Chapter 11: Democracy in machine learning: how to combine different methods.
 Chapter 12: Top-down clustering: K-means.
 Chapter 13: Bottom-up (agglomerative) clustering.
 Chapter 14: Self-organizing maps.
 Chapter 15: Dimensionality reduction by linear transformations (projections).
You can also download the entire book here.
The latest chapter is Chapter 16: Visualizing Graphs and Networks.
After considering visualizations based on linear projections in Chapter 15, let's now consider more general ways to feed the human unsupervised learning capabilities and derive insight from data. We assume that the n entities to be displayed are not necessarily characterized by internal coordinates, but only by item-to-item (i.e., external) relationships, such as dissimilarities between two items i and j denoted by d_ij. If items do have coordinates, such external relationships can be obtained in simple ways, as explained in Sec. 12.2.
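When items do carry internal coordinates, the simplest external relationship is the pairwise Euclidean distance. A minimal sketch (the function name and sample data are illustrative, not from the book):

```python
import numpy as np

def pairwise_dissimilarities(X):
    """Return the (n, n) matrix of Euclidean distances d_ij
    between the rows of X, an (n, k) array of item coordinates."""
    diff = X[:, None, :] - X[None, :, :]   # shape (n, n, k): all p_i - p_j
    return np.sqrt((diff ** 2).sum(axis=-1))

# three items in the plane
X = np.array([[0.0, 0.0],
              [3.0, 4.0],
              [6.0, 8.0]])
D = pairwise_dissimilarities(X)
print(D[0, 1])  # 5.0: the 3-4-5 right triangle distance
```

The resulting matrix D is symmetric with a zero diagonal, which is all the placement methods below require as input.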
Given the way in which our eyes work, visualization is only in two or three dimensions. Our aim is therefore to place items on the 2D plane (or in 3D space) so that their mutual distances are as close as possible to their dissimilarities. In general, a perfect placement obeying all dissimilarities is impossible; therefore, a precise criterion is needed in order to define which placements can be considered acceptable.
The problem is therefore the following: given a set of (positive) dissimilarities d_ij between items, find two-dimensional or three-dimensional coordinates p_i for all items that provide a convenient placement, so that dissimilarities are maintained as much as possible. The simplest approach is stress minimization, stress being the amount by which a visualized distance is compressed or expanded with respect to the original dissimilarity. It is intuitive, physical, and useful also as a starting point to understand more complex approaches.
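Stress minimization can be sketched with plain gradient descent. Assuming the raw stress function sum over i&lt;j of (||p_i - p_j|| - d_ij)^2 (the step size, iteration count, and function names are illustrative choices, not the book's):

```python
import numpy as np

def stress(P, D):
    """Raw stress of placement P (n, dim) w.r.t. dissimilarities D (n, n)."""
    diff = P[:, None, :] - P[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    iu = np.triu_indices(len(P), k=1)          # each pair i < j once
    return ((dist - D) ** 2)[iu].sum()

def minimize_stress(D, dim=2, steps=2000, lr=0.01, seed=0):
    """Place items in `dim` dimensions by gradient descent on raw stress."""
    n = D.shape[0]
    rng = np.random.default_rng(seed)
    P = rng.standard_normal((n, dim))          # random initial placement
    eps = 1e-9                                 # avoid division by zero
    for _ in range(steps):
        diff = P[:, None, :] - P[None, :, :]   # p_i - p_j
        dist = np.sqrt((diff ** 2).sum(axis=-1)) + eps
        # gradient of stress w.r.t. p_i:
        #   sum_j 2 (dist_ij - d_ij) (p_i - p_j) / dist_ij
        coeff = 2.0 * (dist - D) / dist
        np.fill_diagonal(coeff, 0.0)
        grad = (coeff[:, :, None] * diff).sum(axis=1)
        P -= lr * grad
    return P
```

For a small set of dissimilarities that is exactly realizable in the plane (e.g., a 3-4-5 triangle), the final stress approaches zero; for larger or non-Euclidean inputs a residual stress remains, which is precisely why a criterion for "acceptable" placements is needed.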

