ShareBoost: Efficient Multiclass Learning with Feature Sharing

Abstract

Multiclass prediction is the problem of classifying an object into a relevant target class. We consider the problem of learning a multiclass predictor that uses only a few features, and in particular, the number of used features should increase sub-linearly with the number of possible classes. This implies that features should be shared by several classes. We describe and analyze the ShareBoost algorithm for learning a multiclass predictor that uses few shared features. We prove that ShareBoost efficiently finds a predictor that uses few shared features (if such a predictor exists) and that it has a small generalization error. We also describe how to use ShareBoost for learning a non-linear predictor that has a fast evaluation time. In a series of experiments with natural data sets we demonstrate the benefits of ShareBoost and evaluate its success relative to other state-of-the-art approaches.
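
To make the idea of greedily selecting features that are shared across all classes concrete, here is a minimal Python/NumPy sketch of a ShareBoost-style loop for a linear multiclass model with softmax (multiclass logistic) loss. The scoring rule (L1 norm of each feature's gradient row over classes), the gradient-descent refit, and all function names are illustrative assumptions based on the abstract, not a reproduction of the paper's exact algorithm.

import numpy as np

def softmax(scores):
    scores = scores - scores.max(axis=1, keepdims=True)
    e = np.exp(scores)
    return e / e.sum(axis=1, keepdims=True)

def multiclass_log_loss(W, X, y):
    # Mean negative log-likelihood of the correct class under scores X @ W.
    P = softmax(X @ W)
    n = X.shape[0]
    return -np.log(P[np.arange(n), y] + 1e-12).mean()

def loss_gradient(W, X, y):
    # Gradient of the mean multiclass logistic loss w.r.t. W, shape (d, k).
    n = X.shape[0]
    P = softmax(X @ W)
    Y = np.zeros_like(P)
    Y[np.arange(n), y] = 1.0
    return X.T @ (P - Y) / n

def shareboost_sketch(X, y, num_features, num_classes, refit_steps=300, lr=0.2):
    """Greedily pick features shared by all classes (illustrative sketch)."""
    d = X.shape[1]
    selected = []
    W = np.zeros((d, num_classes))
    for _ in range(num_features):
        G = loss_gradient(W, X, y)
        # Score each unused feature by how much it could reduce the loss
        # jointly across all classes (L1 norm of its gradient row).
        scores = np.abs(G).sum(axis=1)
        if selected:
            scores[selected] = -np.inf
        selected.append(int(np.argmax(scores)))
        # Refit the weights restricted to the selected rows of W
        # with plain gradient descent (a simple stand-in for the refit step).
        for _ in range(refit_steps):
            G = loss_gradient(W, X, y)
            W[selected] -= lr * G[selected]
    return W, selected

# Tiny usage example on synthetic data (hypothetical shapes).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 50))
    y = rng.integers(0, 5, size=200)
    W, feats = shareboost_sketch(X, y, num_features=5, num_classes=5)
    print("selected features:", feats)
    print("training loss:", multiclass_log_loss(W, X, y))

Because a single set of selected rows of W serves every class, the number of stored features stays fixed as classes are added to the score matrix, which is the feature-sharing behavior the abstract describes.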

Cite

Text

Shalev-Shwartz et al. "ShareBoost: Efficient Multiclass Learning with Feature Sharing." Neural Information Processing Systems, 2011.

Markdown

[Shalev-Shwartz et al. "ShareBoost: Efficient Multiclass Learning with Feature Sharing." Neural Information Processing Systems, 2011.](https://mlanthology.org/neurips/2011/shalevshwartz2011neurips-shareboost/)

BibTeX

@inproceedings{shalevshwartz2011neurips-shareboost,
  title     = {{ShareBoost: Efficient Multiclass Learning with Feature Sharing}},
  author    = {Shalev-Shwartz, Shai and Wexler, Yonatan and Shashua, Amnon},
  booktitle = {Neural Information Processing Systems},
  year      = {2011},
  pages     = {1179--1187},
  url       = {https://mlanthology.org/neurips/2011/shalevshwartz2011neurips-shareboost/}
}