A Smoothed Boosting Algorithm Using Probabilistic Output Codes
Abstract
AdaBoost.OC has been shown to be an effective method for boosting "weak" binary classifiers in multi-class learning. It employs the Error Correcting Output Code (ECOC) method to convert a multi-class learning problem into a set of binary classification problems, and applies the AdaBoost algorithm to solve them efficiently. In this paper, we propose a new boosting algorithm that improves the AdaBoost.OC algorithm in two aspects: 1) it introduces a smoothing mechanism into the boosting algorithm to alleviate the potential overfitting problem of the AdaBoost algorithm, and 2) it introduces a probabilistic coding scheme to generate binary codes for multiple classes such that training errors can be efficiently reduced. Empirical studies with seven UCI datasets indicate that the proposed boosting algorithm is more robust and effective than the AdaBoost.OC algorithm for multi-class learning.
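The ECOC reduction the abstract refers to can be sketched in a few lines. The code matrix, the decision-stump learner, and the one-dimensional toy data below are illustrative assumptions of this sketch only; they are not the paper's AdaBoost.OC, which draws a fresh "coloring" (code column) each boosting round and reweights examples via AdaBoost.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 3-class data: three well-separated 1-D clusters (illustrative only)
X = np.concatenate([rng.normal(3.0 * c, 0.3, size=(30, 1)) for c in range(3)])
y = np.repeat(np.arange(3), 30)

# Hand-picked code matrix: row k is the 4-bit codeword for class k.
# (Real ECOC/AdaBoost.OC codes are chosen randomly or per round.)
codes = np.array([[1, 0, 1, 0],
                  [0, 0, 1, 1],
                  [0, 1, 0, 1]])
n_bits = codes.shape[1]

# Train one binary classifier (here, an exhaustive decision stump) per column
stumps = []
for b in range(n_bits):
    yb = codes[y, b]  # relabel every example by bit b of its class codeword
    best = None
    for t in np.unique(X):
        for sign in (1, -1):
            pred = (sign * (X[:, 0] - t) > 0).astype(int)
            err = np.mean(pred != yb)
            if best is None or err < best[0]:
                best = (err, t, sign)
    stumps.append(best[1:])

def predict(x):
    # Evaluate each binary stump, then decode to the class whose
    # codeword is nearest in Hamming distance
    bits = np.array([int(s * (x - t) > 0) for t, s in stumps])
    return int(np.argmin(np.abs(codes - bits).sum(axis=1)))

acc = np.mean(np.array([predict(x) for x in X[:, 0]]) == y)
```

Decoding by nearest codeword is what gives the scheme its error-correcting character: a single wrong bit can still map to the correct class when the codewords are far apart in Hamming distance.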
Cite
Text
Jin and Zhang. "A Smoothed Boosting Algorithm Using Probabilistic Output Codes." International Conference on Machine Learning, 2005. doi:10.1145/1102351.1102397

Markdown
[Jin and Zhang. "A Smoothed Boosting Algorithm Using Probabilistic Output Codes." International Conference on Machine Learning, 2005.](https://mlanthology.org/icml/2005/jin2005icml-smoothed/) doi:10.1145/1102351.1102397

BibTeX
@inproceedings{jin2005icml-smoothed,
title = {{A Smoothed Boosting Algorithm Using Probabilistic Output Codes}},
author = {Jin, Rong and Zhang, Jian},
booktitle = {International Conference on Machine Learning},
year = {2005},
pages = {361-368},
doi = {10.1145/1102351.1102397},
url = {https://mlanthology.org/icml/2005/jin2005icml-smoothed/}
}