Precision-Based Boosting

Abstract

AdaBoost is a highly popular ensemble classification method for which many variants have been published. This paper proposes a generic refinement applicable to all of these AdaBoost variants. Instead of assigning weights based on the total error of the base classifiers (as in AdaBoost), our method uses class-specific error rates. On an instance x, it assigns a higher weight to a classifier predicting label y if that classifier is less likely to make a mistake when it predicts class y. Like AdaBoost, our method is guaranteed to boost weak learners into strong learners. An empirical study on AdaBoost and one of its multi-class versions, SAMME, demonstrates the superiority of our method on datasets with more than 1,000 instances as well as on datasets with more than three classes.
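To make the core idea concrete, here is a minimal sketch (not the authors' exact update rule) of precision-based voting: each base classifier's per-class precision is estimated from held-out predictions, and at prediction time a classifier's vote for label y is weighted by the log-odds of its precision on class y, rather than by a single global weight derived from total error as in AdaBoost. All function names and the toy data below are illustrative assumptions.

```python
import math
from collections import Counter

def class_precisions(preds, labels):
    """Estimate a classifier's per-class precision from (prediction, truth)
    pairs, with Laplace smoothing to keep precisions away from 0 and 1."""
    correct, total = Counter(), Counter()
    for p, t in zip(preds, labels):
        total[p] += 1
        correct[p] += p == t
    return {c: (correct[c] + 1) / (total[c] + 2) for c in total}

def precision_weighted_vote(votes, precision_tables):
    """Combine one vote per base classifier; each vote for label y is
    weighted by the log-odds of that classifier's precision on class y
    (a class-specific weight, unlike AdaBoost's single per-classifier
    weight derived from total error)."""
    scores = Counter()
    for y, prec in zip(votes, precision_tables):
        p = prec.get(y, 0.5)          # unseen label -> uninformative weight
        scores[y] += math.log(p / (1 - p))
    return scores.most_common(1)[0][0]

# Toy demo: three base classifiers evaluated on shared labels.
labels = [0, 0, 1, 1, 2, 2]
prec_a = class_precisions([0, 0, 1, 2, 2, 1], labels)  # precise on class 0
prec_b = class_precisions([1, 0, 1, 0, 2, 2], labels)
prec_c = class_precisions([0, 1, 1, 1, 2, 0], labels)

# A plain majority vote on this instance would pick 1, but classifier A's
# high precision on class 0 lets its single vote outweigh the other two.
winner = precision_weighted_vote([0, 1, 1], [prec_a, prec_b, prec_c])
```

The example shows why class-specific weights can overturn a majority: two classifiers vote for label 1, but both have near-chance precision on class 1, so the one high-precision vote for class 0 wins.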

Cite

Text

Nikravan et al. "Precision-Based Boosting." AAAI Conference on Artificial Intelligence, 2021. doi:10.1609/AAAI.V35I10.17105

Markdown

[Nikravan et al. "Precision-Based Boosting." AAAI Conference on Artificial Intelligence, 2021.](https://mlanthology.org/aaai/2021/nikravan2021aaai-precision/) doi:10.1609/AAAI.V35I10.17105

BibTeX

@inproceedings{nikravan2021aaai-precision,
  title     = {{Precision-Based Boosting}},
  author    = {Nikravan, Mohammad Hossein and Movahedan, Marjan and Zilles, Sandra},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2021},
  pages     = {9153--9160},
  doi       = {10.1609/AAAI.V35I10.17105},
  url       = {https://mlanthology.org/aaai/2021/nikravan2021aaai-precision/}
}