Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability

Abstract

Our purpose is to estimate the conditional probabilities of output labels in multiclass classification problems. AdaBoost provides highly accurate classifiers and has the potential to estimate conditional probabilities. However, the conditional probability estimated by AdaBoost tends to overfit the training samples. We propose loss functions for boosting that provide shrinkage estimators. The effect of regularization is realized by shrinking the probabilities toward the uniform distribution. Numerical experiments indicate that boosting algorithms based on the proposed loss functions yield significantly better results than existing boosting algorithms for the estimation of conditional probabilities.
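The core idea of shrinkage toward the uniform distribution can be sketched as follows. This is not the paper's estimator itself (the proposed loss functions are not given in the abstract); it is a minimal illustration, with a hypothetical shrinkage weight `lam`, of how an overconfident class-probability vector is pulled toward the uniform distribution over `K` classes.

```python
import numpy as np

def shrink_toward_uniform(p, lam):
    """Shrink a class-probability vector toward the uniform distribution.

    p   : probability vector over K classes (sums to 1)
    lam : shrinkage weight in [0, 1]; 0 keeps p, 1 gives the uniform distribution
    """
    p = np.asarray(p, dtype=float)
    K = p.size
    # Convex combination of the estimate and the uniform distribution 1/K
    return (1.0 - lam) * p + lam / K

# Example: an overconfident 3-class estimate pulled halfway toward (1/3, 1/3, 1/3)
q = shrink_toward_uniform([0.9, 0.08, 0.02], 0.5)
print(q)            # less extreme probabilities
print(q.sum())      # still a valid distribution
```

The shrunk vector remains a valid probability distribution because it is a convex combination of two distributions; larger `lam` corresponds to stronger regularization.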

Cite

Text

Kanamori. "Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability." International Conference on Algorithmic Learning Theory, 2007. doi:10.1007/978-3-540-75225-7_29

Markdown

[Kanamori. "Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability." International Conference on Algorithmic Learning Theory, 2007.](https://mlanthology.org/alt/2007/kanamori2007alt-multiclass/) doi:10.1007/978-3-540-75225-7_29

BibTeX

@inproceedings{kanamori2007alt-multiclass,
  title     = {{Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability}},
  author    = {Kanamori, Takafumi},
  booktitle = {International Conference on Algorithmic Learning Theory},
  year      = {2007},
  pages     = {358--372},
  doi       = {10.1007/978-3-540-75225-7_29},
  url       = {https://mlanthology.org/alt/2007/kanamori2007alt-multiclass/}
}