What's New in this Edition

This major new edition features many topics not covered in the original, including graphical models, random forests, ensemble methods, least angle regression and path algorithms for the lasso, non-negative matrix factorization, and spectral clustering. There is also a chapter on methods for "wide" data (p bigger than n), including multiple testing and false discovery rates.

Here is a detailed list of the changes in this edition:

Chapter                                                   What's new
 1. Introduction
 2. Overview of Supervised Learning
 3. Linear Methods for Regression                         LAR algorithm and generalizations of the lasso
 4. Linear Methods for Classification                     Lasso path for logistic regression
 5. Basis Expansions and Regularization                   Additional illustrations of RKHS
 6. Kernel Smoothing Methods
 7. Model Assessment and Selection                        Strengths and pitfalls of cross-validation
 8. Model Inference and Averaging
 9. Additive Models, Trees, and Related Methods
10. Boosting and Additive Trees                           New example from ecology; some material split off to Chapter 16
11. Neural Networks                                       Bayesian neural nets and the NIPS 2003 challenge
12. Support Vector Machines and Flexible Discriminants    Path algorithm for SVM classifier
13. Prototype Methods and Nearest-Neighbors
14. Unsupervised Learning                                 Spectral clustering, kernel PCA, sparse PCA, non-negative matrix factorization, archetypal analysis, nonlinear dimension reduction, Google page rank algorithm, a direct approach to ICA
15. Random Forests                                        New
16. Ensemble Learning                                     New
17. Undirected Graphical Models                           New
18. High-Dimensional Problems                             New

Some further notes: