How to Make AdaBoost.M1 Work for Weak Base Classifiers by Changing Only One Line of the Code
Abstract
If one has a multiclass classification problem and wants to boost a multiclass base classifier, AdaBoost.M1 is a well-known and widely applied boosting algorithm. However, AdaBoost.M1 does not work if the base classifier is too weak. We show that by modifying only one line of AdaBoost.M1, one can make it usable for weak base classifiers as well. The resulting classifier, AdaBoost.M1W, is guaranteed to minimize an upper bound on a performance measure called the guessing error, as long as the base classifier is better than random guessing. The usability of AdaBoost.M1W is clearly demonstrated experimentally.
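The abstract does not reproduce the modified line itself. As an illustrative sketch only: the change commonly attributed to AdaBoost.M1W adds a ln(K−1) term to the classifier weight, replacing α_t = ln((1−ε_t)/ε_t) with α_t = ln((1−ε_t)/ε_t) + ln(K−1), which relaxes the stopping condition from ε_t < 1/2 to ε_t < (K−1)/K (better than random guessing over K classes). The decision-stump learner and toy data below are hypothetical additions for demonstration, not from the paper.

```python
import numpy as np

def train_stump(X, y, w, n_classes):
    """Exhaustively fit the best weighted single-feature threshold stump."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for lo in range(n_classes):
                for hi in range(n_classes):
                    if lo == hi:
                        continue
                    pred = np.where(X[:, j] <= thr, lo, hi)
                    err = np.sum(w[pred != y])
                    if err < best_err:
                        best_err, best = err, (j, thr, lo, hi)
    return best, best_err

def stump_predict(stump, X):
    j, thr, lo, hi = stump
    return np.where(X[:, j] <= thr, lo, hi)

def adaboost_m1w(X, y, n_classes, n_rounds=10):
    n = len(y)
    w = np.ones(n) / n
    ensemble = []
    for _ in range(n_rounds):
        stump, err = train_stump(X, y, w, n_classes)
        # Stop only if the base classifier is no better than random
        # guessing over K classes, i.e. err >= (K-1)/K.
        if err >= 1.0 - 1.0 / n_classes:
            break
        err = max(err, 1e-10)
        # AdaBoost.M1 would use: alpha = np.log((1 - err) / err)
        # The assumed one-line change adds log(K - 1):
        alpha = np.log((1 - err) / err) + np.log(n_classes - 1)
        pred = stump_predict(stump, X)
        w *= np.exp(alpha * (pred != y))   # up-weight misclassified points
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X, n_classes):
    """Weighted plurality vote over the boosted stumps."""
    votes = np.zeros((len(X), n_classes))
    for alpha, stump in ensemble:
        votes[np.arange(len(X)), stump_predict(stump, X)] += alpha
    return votes.argmax(axis=1)
```

With the original AdaBoost.M1 weight, a stump with error between 1/2 and (K−1)/K would receive a negative α and halt boosting; the extra ln(K−1) keeps α positive for any base classifier that beats random guessing.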
Cite
Text
Eibl and Pfeiffer. "How to Make AdaBoost.M1 Work for Weak Base Classifiers by Changing Only One Line of the Code." European Conference on Machine Learning, 2002. doi:10.1007/3-540-36755-1_7
Markdown
[Eibl and Pfeiffer. "How to Make AdaBoost.M1 Work for Weak Base Classifiers by Changing Only One Line of the Code." European Conference on Machine Learning, 2002.](https://mlanthology.org/ecmlpkdd/2002/eibl2002ecml-make/) doi:10.1007/3-540-36755-1_7
BibTeX
@inproceedings{eibl2002ecml-make,
title = {{How to Make AdaBoost.M1 Work for Weak Base Classifiers by Changing Only One Line of the Code}},
author = {Eibl, Günther and Pfeiffer, Karl Peter},
booktitle = {European Conference on Machine Learning},
year = {2002},
pages = {72--83},
doi = {10.1007/3-540-36755-1_7},
url = {https://mlanthology.org/ecmlpkdd/2002/eibl2002ecml-make/}
}