Regularizing AdaBoost
Abstract
Boosting methods maximize a hard classification margin and are known as powerful techniques that do not exhibit overfitting in low-noise cases. For noisy data, however, boosting will still try to enforce a hard margin and thereby give too much weight to outliers, which leads to the dilemma of non-smooth fits and overfitting. We therefore propose three algorithms that allow for soft margin classification by introducing regularization with slack variables into the boosting concept: (1) AdaBoost_reg and regularized versions of (2) linear and (3) quadratic programming AdaBoost. Experiments show the usefulness of the proposed algorithms in comparison to another soft margin classifier: the support vector machine.
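The following is a minimal Python sketch of the soft-margin idea the abstract describes, assuming decision stumps as base learners. The accumulated influence `mu` of each training example is added to its margin, scaled by a constant `C`, so persistent outliers stop attracting ever-larger weights. The update rule only approximates the paper's AdaBoost_reg; the names `adaboost_reg`, `stump_fit`, and `stump_predict` are illustrative, not from the paper.

```python
import numpy as np

def stump_fit(X, y, w):
    """Pick the decision stump (feature, threshold, sign) with the
    lowest weighted training error under example weights w."""
    best = (np.inf, 0, 0.0, 1)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] <= thr, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[0]:
                    best = (err, j, thr, sign)
    return best

def stump_predict(X, j, thr, sign):
    return sign * np.where(X[:, j] <= thr, 1, -1)

def adaboost_reg(X, y, T=50, C=1.0):
    """Soft-margin boosting sketch: each example's margin is augmented
    by C times its accumulated influence mu before reweighting, so the
    algorithm no longer enforces a hard margin on outliers."""
    n = len(y)
    w = np.full(n, 1.0 / n)   # example weights
    mu = np.zeros(n)          # accumulated influence of each example
    F = np.zeros(n)           # ensemble output on the training set
    alphas, stumps = [], []
    for _ in range(T):
        err, j, thr, sign = stump_fit(X, y, w)
        if err >= 0.5:
            break
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        h = stump_predict(X, j, thr, sign)
        alphas.append(alpha)
        stumps.append((j, thr, sign))
        F += alpha * h
        mu += alpha * w       # influence this example exerted so far
        # soft margin = hard margin y*F plus the regularization term C*mu
        s = y * F + C * mu
        w = np.exp(-(s - s.min()))   # shift by min for numerical stability
        w /= w.sum()
    def predict(Xnew):
        agg = sum(a * stump_predict(Xnew, j, thr, sg)
                  for a, (j, thr, sg) in zip(alphas, stumps))
        return np.sign(agg)
    return predict

# Hypothetical usage on synthetic, linearly separable data:
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
clf = adaboost_reg(X, y, T=30, C=0.5)
print("training accuracy:", (clf(X) == y).mean())
```

Setting `C = 0` recovers an ordinary hard-margin AdaBoost update in this sketch; larger `C` damps the weight growth of repeatedly misclassified examples, which is the soft-margin behavior the abstract motivates.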
Cite
Text
Rätsch et al. "Regularizing AdaBoost." Neural Information Processing Systems, 1998.
Markdown
[Rätsch et al. "Regularizing AdaBoost." Neural Information Processing Systems, 1998.](https://mlanthology.org/neurips/1998/ratsch1998neurips-regularizing/)
BibTeX
@inproceedings{ratsch1998neurips-regularizing,
  title = {{Regularizing AdaBoost}},
  author = {Rätsch, Gunnar and Onoda, Takashi and Müller, Klaus R.},
  booktitle = {Neural Information Processing Systems},
  year = {1998},
  pages = {564-570},
  url = {https://mlanthology.org/neurips/1998/ratsch1998neurips-regularizing/}
}