Boosting with Noisy Data: Some Views from Statistical Theory

Abstract

This letter is a comprehensive account of some recent findings about AdaBoost in the presence of noisy data, approached from the perspective of statistical theory. We start from the basic assumption of weak hypotheses used in AdaBoost and study its validity and its implications for the generalization error. We recommend studying the generalization error and comparing it to the optimal Bayes error when the data are noisy. Analytic examples are provided to show that running the unmodified AdaBoost forever will lead to overfitting. On the other hand, there exist regularized versions of AdaBoost that are consistent, in the sense that the resulting prediction will approximately attain the optimal performance in the limit of large training samples.
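The overfitting behavior described above can be illustrated numerically. The following is a minimal NumPy sketch of AdaBoost with decision stumps on a toy one-dimensional problem with randomly flipped labels; the synthetic data, the 10% noise rate, and the round count are illustrative assumptions of this sketch, not the paper's construction. With enough rounds, the ensemble drives the training error below the noise rate, i.e., it starts fitting the noise.

```python
import numpy as np

# Toy setup (illustrative, not the paper's construction): labels are
# sign(x) flipped with probability 0.1, so the Bayes error is 0.1.
rng = np.random.default_rng(0)
n = 200
X = rng.uniform(-1.0, 1.0, size=n)
y = np.sign(X)                      # true concept: sign(x)
y[rng.random(n) < 0.1] *= -1.0      # 10% label noise

def stump(x, thresh, sign):
    """Decision stump: sign * (+1 if x > thresh else -1)."""
    return sign * np.where(x > thresh, 1.0, -1.0)

def best_stump(X, y, w):
    """Exhaustively pick the stump minimizing weighted training error."""
    best = (np.inf, 0.0, 1.0)
    for thresh in X:
        for sign in (1.0, -1.0):
            err = np.sum(w[stump(X, thresh, sign) != y])
            if err < best[0]:
                best = (err, thresh, sign)
    return best

T = 100                             # boosting rounds (illustrative)
w = np.full(n, 1.0 / n)             # uniform initial example weights
ensemble = []
for _ in range(T):
    err, thresh, sign = best_stump(X, y, w)
    err = np.clip(err, 1e-12, 1.0 - 1e-12)
    alpha = 0.5 * np.log((1.0 - err) / err)   # AdaBoost hypothesis weight
    pred = stump(X, thresh, sign)
    w *= np.exp(-alpha * y * pred)            # upweight the mistakes
    w /= w.sum()
    ensemble.append((alpha, thresh, sign))

def predict(x):
    """Sign of the weighted vote of all stumps."""
    score = sum(a * stump(x, t, s) for a, t, s in ensemble)
    return np.sign(score)

train_err = np.mean(predict(X) != y)
```

Because the weak hypotheses (stumps over a one-dimensional input) can isolate individual noisy points, the training error falls below the Bayes error of 0.1, while the error with respect to the clean concept sign(x) does not improve correspondingly. Regularized variants of the kind the letter shows to be consistent modify this unbounded fitting.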

Cite

Text

Jiang. "Boosting with Noisy Data: Some Views from Statistical Theory." Neural Computation, 2004. doi:10.1162/089976604322860703

Markdown

[Jiang. "Boosting with Noisy Data: Some Views from Statistical Theory." Neural Computation, 2004.](https://mlanthology.org/neco/2004/jiang2004neco-boosting/) doi:10.1162/089976604322860703

BibTeX

@article{jiang2004neco-boosting,
  title     = {{Boosting with Noisy Data: Some Views from Statistical Theory}},
  author    = {Jiang, Wenxin},
  journal   = {Neural Computation},
  year      = {2004},
  pages     = {789--810},
  doi       = {10.1162/089976604322860703},
  volume    = {16},
  url       = {https://mlanthology.org/neco/2004/jiang2004neco-boosting/}
}