Robustifying AdaBoost by Adding the Naive Error Rate

Abstract

AdaBoost can be derived by sequential minimization of the exponential loss function. It implements the learning process by exponentially reweighting examples according to classification results. However, the weights are often too sharply tuned, so AdaBoost suffers from nonrobustness and overlearning. We propose a new boosting method that is a slight modification of AdaBoost. The loss function is defined by a mixture of the exponential loss and naive error loss functions. As a result, the proposed method incorporates the effect of forgetfulness into AdaBoost. The statistical significance of our method is discussed, and simulations are presented for confirmation.
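The abstract's key idea, blending the exponential loss with a naive error loss so that large misclassification margins no longer dominate the example weights, can be sketched as follows. This is an illustrative reading only, not the paper's exact formulation: the mixing parameter `eta` and the use of the 0-1 misclassification loss as the "naive error loss" are assumptions made here for demonstration, and `margin` denotes `y * F(x)`.

```python
import math

def mixed_loss(margin, eta=0.1):
    """Mixture of the exponential loss and a naive error (0-1) loss.

    `eta` and the 0-1 loss are illustrative assumptions, not
    necessarily the authors' exact definitions. margin = y * F(x).
    With eta = 0 this reduces to AdaBoost's exponential loss.
    """
    exp_loss = math.exp(-margin)
    naive_error = 1.0 if margin <= 0 else 0.0  # 0-1 misclassification
    return (1.0 - eta) * exp_loss + eta * naive_error

# For a badly misclassified example (large negative margin), the
# mixture grows far more slowly than the pure exponential loss,
# which is the robustifying effect described in the abstract.
for m in (-2.0, -0.5, 0.5, 2.0):
    print(f"margin={m:+.1f}  exp-only={mixed_loss(m, 0.0):.3f}"
          f"  mixed={mixed_loss(m, 0.2):.3f}")
```

Because the naive error term is bounded, outliers with extreme negative margins receive proportionally smaller weight updates than under pure AdaBoost, which is the "forgetfulness" effect the abstract mentions.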

Cite

Text

Takenouchi and Eguchi. "Robustifying AdaBoost by Adding the Naive Error Rate." Neural Computation, 2004. doi:10.1162/089976604322860695

Markdown

[Takenouchi and Eguchi. "Robustifying AdaBoost by Adding the Naive Error Rate." Neural Computation, 2004.](https://mlanthology.org/neco/2004/takenouchi2004neco-robustifying/) doi:10.1162/089976604322860695

BibTeX

@article{takenouchi2004neco-robustifying,
  title     = {{Robustifying AdaBoost by Adding the Naive Error Rate}},
  author    = {Takenouchi, Takashi and Eguchi, Shinto},
  journal   = {Neural Computation},
  year      = {2004},
  pages     = {767--787},
  doi       = {10.1162/089976604322860695},
  volume    = {16},
  url       = {https://mlanthology.org/neco/2004/takenouchi2004neco-robustifying/}
}