Unifying the Error-Correcting and Output-Code AdaBoost Within the Margin Framework
Abstract
In this paper, we present a new interpretation of AdaBoost.ECC and AdaBoost.OC. We show that AdaBoost.ECC performs stage-wise functional gradient descent on a cost function defined in the domain of margin values, and that AdaBoost.OC is a shrinkage version of AdaBoost.ECC. These findings strictly explain some properties of the two algorithms. The gradient-minimization formulation of AdaBoost.ECC allows us to derive a new algorithm, referred to as AdaBoost.SECC, by explicitly exploiting shrinkage as regularization in AdaBoost.ECC. Experiments on diverse databases confirm our theoretical findings. Empirical results show that AdaBoost.SECC performs significantly better than AdaBoost.ECC and AdaBoost.OC.
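To make the shrinkage idea concrete, the sketch below shows generic AdaBoost-style stage-wise gradient descent on the exponential margin cost, where a factor `nu` scales each step: `nu = 1` recovers the unshrunk update and `nu < 1` acts as regularization. This is only an illustration of shrinkage in boosting under simplifying assumptions (binary labels, decision stumps), not the paper's AdaBoost.SECC, which operates on output-coded multiclass problems; the function names and the parameter `nu` are introduced here for illustration.

```python
# Minimal illustrative sketch of shrinkage in boosting (NOT the paper's
# AdaBoost.SECC): binary AdaBoost with decision stumps, where each
# stage-wise step is scaled by a shrinkage factor `nu`.
import numpy as np

def train_stump(X, y, w):
    """Pick the decision stump (feature, threshold, sign) minimizing weighted error."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] <= thr, 1, -1)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best = err, (j, thr, sign)
    return best, best_err

def stump_predict(stump, X):
    j, thr, sign = stump
    return sign * np.where(X[:, j] <= thr, 1, -1)

def boost_with_shrinkage(X, y, rounds=20, nu=1.0):
    """AdaBoost on labels y in {-1, +1}; nu < 1 shrinks every step size."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        stump, err = train_stump(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)  # line-search step size
        step = nu * alpha                      # shrinkage scales the step
        pred = stump_predict(stump, X)
        w *= np.exp(-step * y * pred)          # re-weight via the margin cost
        w /= w.sum()
        ensemble.append((step, stump))
    return ensemble

def predict(ensemble, X):
    F = sum(step * stump_predict(stump, X) for step, stump in ensemble)
    return np.sign(F)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
    model = boost_with_shrinkage(X, y, rounds=30, nu=0.5)
    print("training accuracy:", np.mean(predict(model, X) == y))
```

With smaller `nu`, each weak hypothesis contributes less, so more rounds are needed but the combined classifier tends to overfit less aggressively, which is the regularization effect the abstract attributes to shrinkage.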
Cite
Text
Sun et al. "Unifying the Error-Correcting and Output-Code AdaBoost Within the Margin Framework." International Conference on Machine Learning, 2005. doi:10.1145/1102351.1102461

Markdown

[Sun et al. "Unifying the Error-Correcting and Output-Code AdaBoost Within the Margin Framework." International Conference on Machine Learning, 2005.](https://mlanthology.org/icml/2005/sun2005icml-unifying/) doi:10.1145/1102351.1102461

BibTeX
@inproceedings{sun2005icml-unifying,
title = {{Unifying the Error-Correcting and Output-Code AdaBoost Within the Margin Framework}},
author = {Sun, Yijun and Todorovic, Sinisa and Li, Jian and Wu, Dapeng},
booktitle = {International Conference on Machine Learning},
year = {2005},
  pages = {872--879},
doi = {10.1145/1102351.1102461},
url = {https://mlanthology.org/icml/2005/sun2005icml-unifying/}
}