Regularized Multi-Class Semi-Supervised Boosting

Abstract

Many semi-supervised learning algorithms only deal with binary classification. Their extension to the multi-class problem is usually obtained by repeatedly solving a set of binary problems. Additionally, many of these methods do not scale well with respect to a large number of unlabeled samples, which limits their applicability to large-scale problems with many classes and unlabeled samples. In this paper, we directly address the multi-class semi-supervised learning problem with an efficient boosting method. In particular, we introduce a new multi-class margin-maximizing loss function for the unlabeled data and use generalized expectation regularization to incorporate cluster priors into the model. Our approach enables efficient use of very large data sets. The performance and efficiency of our method are demonstrated on both standard machine learning data sets and challenging object categorization tasks.
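The core idea behind the unlabeled-data loss is the multi-class margin: the gap between a sample's top classifier score and its runner-up score. The sketch below is an illustrative NumPy implementation of an exponential loss on that margin, which penalizes unlabeled points lying near decision boundaries; it is a minimal sketch of the general concept, not the paper's exact loss function.

```python
import numpy as np

def unlabeled_margin_loss(F):
    """Exponential loss on the multi-class margin of unlabeled samples.

    F: array of shape (n_samples, n_classes) holding real-valued
    classifier scores (e.g. from a boosted ensemble).

    The margin of a sample is its top score minus its second-best
    score; minimizing exp(-margin) pushes unlabeled samples away from
    decision boundaries. Illustrative sketch, not the paper's loss.
    """
    part = np.partition(F, -2, axis=1)   # top two scores end up in the last two columns
    margin = part[:, -1] - part[:, -2]   # best minus runner-up, per sample
    return np.exp(-margin)               # small margin -> large penalty
```

For example, a confidently classified sample with scores `[2.0, 0.0]` incurs a loss of `exp(-2) ≈ 0.135`, while a sample on the boundary with scores `[1.0, 1.0]` incurs the maximal loss of `1.0`.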

Cite

Text

Saffari et al. "Regularized Multi-Class Semi-Supervised Boosting." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2009. doi:10.1109/CVPR.2009.5206715

Markdown

[Saffari et al. "Regularized Multi-Class Semi-Supervised Boosting." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2009.](https://mlanthology.org/cvpr/2009/saffari2009cvpr-regularized/) doi:10.1109/CVPR.2009.5206715

BibTeX

@inproceedings{saffari2009cvpr-regularized,
  title     = {{Regularized Multi-Class Semi-Supervised Boosting}},
  author    = {Saffari, Amir and Leistner, Christian and Bischof, Horst},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year      = {2009},
  pages     = {967--974},
  doi       = {10.1109/CVPR.2009.5206715},
  url       = {https://mlanthology.org/cvpr/2009/saffari2009cvpr-regularized/}
}