KDnuggets 99:05, item 2, Publications:

Date: Wed, 24 Feb 1999 13:17:07 +1100
From: Jerry.Friedman@cmis.CSIRO.AU
Subject: Preprint: boosting methods for regression and classification.
                    Greedy Function Approximation:
                     A Gradient Boosting Machine

                         Jerome H. Friedman
                         Stanford University

                              ABSTRACT
Function approximation is viewed from the perspective of numerical
optimization in function space, rather than parameter space. A
connection is made between stagewise additive expansions and
steepest-descent minimization. A general gradient-descent "boosting"
paradigm is developed for additive expansions based on any fitting
criterion. Specific algorithms are presented for least-squares,
least-absolute-deviation, and Huber-M loss functions for regression,
and multi-class logistic likelihood for classification. Special
enhancements are derived for the particular case where the individual
additive components are decision trees, and tools for interpreting
such "TreeBoost" models are presented. Gradient boosting of decision
trees produces competitive, highly robust, interpretable procedures
for regression and classification, especially appropriate for mining
less-than-clean data. Connections between this approach and the
boosting methods of Freund and Schapire (1996) and Friedman, Hastie,
and Tibshirani (1998) are discussed.

Available from: http://www-stat.stanford.edu/~jhf/ftp/trebst.ps
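As a rough illustration of the least-squares case described in the
abstract, here is a minimal Python sketch of the stagewise procedure:
each stage fits a small regression tree to the current residuals (the
negative gradient of squared-error loss) and adds it to the expansion
with a shrinkage factor. The function names, parameter values, and use
of scikit-learn's DecisionTreeRegressor are illustrative assumptions,
not details taken from the paper.

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    def gradient_boost_ls(X, y, n_stages=100, learning_rate=0.1, max_depth=3):
        # Stagewise additive expansion F_M(x) = F_0 + sum_m nu * h_m(x),
        # built by steepest descent in function space (least-squares loss).
        f0 = y.mean()                  # best constant fit under squared error
        pred = np.full(len(y), f0)
        trees = []
        for _ in range(n_stages):
            residual = y - pred        # negative gradient of (1/2)(y - F)^2
            tree = DecisionTreeRegressor(max_depth=max_depth)
            tree.fit(X, residual)      # base learner approximates the gradient step
            pred += learning_rate * tree.predict(X)
            trees.append(tree)
        return f0, trees

    def boosted_predict(f0, trees, X, learning_rate=0.1):
        # Evaluate the expansion; use the same shrinkage as in fitting.
        pred = np.full(X.shape[0], f0)
        for tree in trees:
            pred += learning_rate * tree.predict(X)
        return pred

    # Toy usage: recover a noisy sine curve.
    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 6.0, size=(200, 1))
    y = np.sin(X[:, 0]) + rng.normal(scale=0.2, size=200)
    f0, trees = gradient_boost_ls(X, y)
    print("train MSE:", np.mean((y - boosted_predict(f0, trees, X)) ** 2))

For the least-absolute-deviation and Huber-M cases mentioned in the
abstract, only the pseudo-residual (the negative gradient of the loss)
and the per-stage fitting target change; the stagewise structure above
stays the same.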

Copyright © 1999 KDnuggets