Multi-Class Deep Boosting

Abstract

We present new ensemble learning algorithms for multi-class classification. Our algorithms can use as their base classifier set a family of deep decision trees or other rich or complex families, and yet benefit from strong generalization guarantees. We give new data-dependent learning bounds for convex ensembles in the multi-class classification setting, expressed in terms of the Rademacher complexities of the sub-families composing the base classifier set and the mixture weight assigned to each sub-family. These bounds are finer than existing ones, both through an improved dependency on the number of classes and, more crucially, through a more favorable complexity term expressed as an average of the sub-family Rademacher complexities weighted by the ensemble's mixture weights. We introduce and discuss several new multi-class ensemble algorithms benefiting from these guarantees, prove positive results for the H-consistency of several of them, and report experimental results showing that their performance compares favorably with that of multi-class versions of AdaBoost and Logistic Regression, as well as their L1-regularized counterparts.
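The structural novelty of the bounds is easiest to see in schematic form. The display below is a sketch under stated assumptions, not the paper's exact theorem: the constants, the confidence term, and the class-count factor C(c) are placeholders for the precise quantities in the paper.

% Schematic margin bound for a convex ensemble f = \sum_t \alpha_t h_t,
% with \alpha_t \ge 0, \sum_t \alpha_t \le 1, and each base classifier
% h_t drawn from a sub-family H_{k_t} of H = \cup_{j=1}^p H_j:
\[
  R(f) \;\le\; \widehat{R}_{S,\rho}(f)
  \;+\; \frac{C(c)}{\rho} \sum_{t=1}^{T} \alpha_t \,
        \mathfrak{R}_m\!\big(H_{k_t}\big)
  \;+\; O\!\left(\sqrt{\frac{\log p}{\rho^{2} m}}\right)
\]
% R(f): multi-class generalization error;  \widehat{R}_{S,\rho}(f):
% empirical margin loss at margin \rho;  \mathfrak{R}_m(H_j): Rademacher
% complexity of sub-family H_j;  m: sample size;  c: number of classes.

The key point, matching the abstract, is that the complexity term is a mixture-weight-weighted average of sub-family complexities rather than the complexity of the full union H, so placing small weight on deep (complex) sub-families keeps the bound tight.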
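The algorithmic idea, boosting over a union of sub-families while charging each candidate for the complexity of its own sub-family, can also be sketched in code. The snippet below is an illustration only, not the authors' MDeepBoost: it grafts a hypothetical depth-proportional penalty (a stand-in for the Rademacher-complexity term) onto SAMME-style multi-class AdaBoost updates, and the name deep_boost_sketch, the lam penalty weight, and the penalty schedule are all assumptions.

# Minimal sketch of complexity-penalized multi-class boosting.
# NOT the paper's MDeepBoost: a hypothetical depth-dependent penalty
# stands in for the Rademacher-complexity term, and the round weights
# and reweighting follow SAMME-style multi-class AdaBoost.
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def deep_boost_sketch(X, y, n_rounds=50, max_depths=(1, 2, 4, 8), lam=0.01):
    """Boost over a union of tree families H_1, ..., H_p (one per depth),
    picking at each round the tree whose weighted accuracy, minus a
    penalty growing with depth, is largest. Assumes labels are 0..k-1."""
    n, k = len(X), len(np.unique(y))
    w = np.full(n, 1.0 / n)           # example weights
    ensemble = []                      # list of (alpha, tree)

    for _ in range(n_rounds):
        best = None
        for depth in max_depths:       # one candidate per sub-family
            tree = DecisionTreeClassifier(max_depth=depth)
            tree.fit(X, y, sample_weight=w)
            err = np.average(tree.predict(X) != y, weights=w)
            # Hypothetical penalty: deeper (more complex) sub-families
            # pay more, mimicking the complexity-weighted criterion.
            score = (1.0 - err) - lam * depth
            if best is None or score > best[0]:
                best = (score, err, tree)

        _, err, tree = best
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = np.log((1 - err) / err) + np.log(k - 1)   # SAMME weight
        ensemble.append((alpha, tree))

        # Upweight misclassified examples, then renormalize.
        w *= np.exp(alpha * (tree.predict(X) != y))
        w /= w.sum()

    def predict(X_new):
        votes = np.zeros((len(X_new), k))
        for alpha, tree in ensemble:
            votes[np.arange(len(X_new)), tree.predict(X_new)] += alpha
        return votes.argmax(axis=1)

    return predict

A call such as predict = deep_boost_sketch(X_train, y_train) followed by predict(X_test) exercises the sketch; raising lam biases the selection toward shallower sub-families, which is the qualitative trade-off the paper's bounds formalize.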

Cite

Text

Kuznetsov et al. "Multi-Class Deep Boosting." Neural Information Processing Systems, 2014.

Markdown

[Kuznetsov et al. "Multi-Class Deep Boosting." Neural Information Processing Systems, 2014.](https://mlanthology.org/neurips/2014/kuznetsov2014neurips-multiclass/)

BibTeX

@inproceedings{kuznetsov2014neurips-multiclass,
  title     = {{Multi-Class Deep Boosting}},
  author    = {Kuznetsov, Vitaly and Mohri, Mehryar and Syed, Umar},
  booktitle = {Neural Information Processing Systems},
  year      = {2014},
  pages     = {2501--2509},
  url       = {https://mlanthology.org/neurips/2014/kuznetsov2014neurips-multiclass/}
}