Boosting Regression Estimators
Abstract
There is interest in extending the boosting algorithm (Schapire, 1990) to fit a wide range of regression problems. The threshold-based boosting algorithm for regression uses an analogy between classification errors and large errors in regression. We focus on the practical aspects of this algorithm and compare it to other attempts to extend boosting to regression. The practical capabilities of this model are demonstrated on the laser data from the Santa Fe time-series competition and on the Mackey-Glass time series, where the results surpass those of a standard ensemble average.
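The core idea described above — treating a regression residual larger than some threshold as the analogue of a misclassification, and then applying a Schapire-style three-stage boosting scheme — can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: the weak learner (a low-degree polynomial fit), the threshold value, the synthetic data, and the median combination rule are all assumptions made for the example.

```python
import numpy as np

def fit_poly(X, y, deg=3):
    """Weak regressor: a least-squares polynomial fit (an assumed stand-in)."""
    c = np.polyfit(X, y, deg)
    return lambda x: np.polyval(c, x)

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, 300)
y = np.sin(3 * X) + 0.1 * rng.normal(size=X.size)

# Stage 1: train the first estimator on a bootstrap sample.
i1 = rng.choice(X.size, X.size)
h1 = fit_poly(X[i1], y[i1])

# A "big error" (|residual| > gamma) plays the role of a misclassification.
gamma = 0.15
big = np.abs(h1(X) - y) > gamma

# Stage 2: train on a set that is half big-error, half small-error points,
# mimicking the filtered training set of the classification algorithm.
n = min(big.sum(), (~big).sum())
i2 = np.concatenate([rng.choice(np.where(big)[0], n),
                     rng.choice(np.where(~big)[0], n)])
h2 = fit_poly(X[i2], y[i2])

# Stage 3: train on points where h1 and h2 "disagree" (differ by > gamma).
disagree = np.abs(h1(X) - h2(X)) > gamma
h3 = fit_poly(X[disagree], y[disagree]) if disagree.sum() > 10 else h2

def boosted(x):
    # Combine by taking the median of the three predictions,
    # the regression analogue of majority vote.
    return np.median([h1(x), h2(x), h3(x)], axis=0)
```

The median combination makes the ensemble robust to one estimator's large errors on a given point, which is what the classification-error analogy is meant to exploit.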
Cite
Text
Avnimelech and Intrator. "Boosting Regression Estimators." Neural Computation, 1999. doi:10.1162/089976699300016746
Markdown
[Avnimelech and Intrator. "Boosting Regression Estimators." Neural Computation, 1999.](https://mlanthology.org/neco/1999/avnimelech1999neco-boosting/) doi:10.1162/089976699300016746
BibTeX
@article{avnimelech1999neco-boosting,
title = {{Boosting Regression Estimators}},
author = {Avnimelech, Ran and Intrator, Nathan},
journal = {Neural Computation},
year = {1999},
pages = {499--520},
doi = {10.1162/089976699300016746},
volume = {11},
url = {https://mlanthology.org/neco/1999/avnimelech1999neco-boosting/}
}