A Convex Relaxation for Weakly Supervised Classifiers
Abstract
This paper introduces a general multi-class approach to weakly supervised classification. Inferring the labels and learning the parameters of the model are usually done jointly through a block-coordinate descent algorithm such as expectation-maximization (EM), which may get trapped in local minima. To avoid this problem, we propose a cost function based on a convex relaxation of the soft-max loss. We then propose an algorithm specifically designed to efficiently solve the corresponding semidefinite program (SDP). Empirically, our method compares favorably with standard approaches on several datasets for multiple instance learning and semi-supervised learning, as well as on clustering tasks.
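To make the shape of such a relaxation concrete: a common construction in this line of work (e.g., discriminative clustering) is to replace the combinatorial search over a 0/1 label matrix Y with an optimization over the equivalence matrix M ≈ YYᵀ, which is positive semidefinite by construction. The sketch below, assuming cvxpy and a hypothetical symmetric cost matrix C, illustrates a generic clustering-style SDP relaxation of this kind; it is not the paper's exact program or its specialized solver.

```python
# Minimal sketch of an SDP relaxation over an equivalence matrix
# M ~ Y Y^T, where Y is the unknown 0/1 label matrix.
# C is a hypothetical symmetric cost; illustrative only.
import cvxpy as cp
import numpy as np

n = 20                                   # number of points (toy size)
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
C = (A + A.T) / 2                        # hypothetical symmetric cost matrix

M = cp.Variable((n, n), symmetric=True)  # relaxed equivalence matrix
constraints = [
    M >> 0,           # M = Y Y^T is positive semidefinite
    cp.diag(M) == 1,  # each point belongs to exactly one class
    M >= 0,           # co-membership values are nonnegative
]
prob = cp.Problem(cp.Minimize(cp.trace(C @ M)), constraints)
prob.solve()
print("relaxed objective:", prob.value)
```

The relaxed optimum M can then be rounded back to discrete labels, for instance by clustering its leading eigenvectors; the paper's contribution is a cost function derived from the soft-max loss together with an algorithm tailored to solving the resulting SDP efficiently.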
Cite

Joulin and Bach. "A Convex Relaxation for Weakly Supervised Classifiers." International Conference on Machine Learning, 2012. https://mlanthology.org/icml/2012/joulin2012icml-convex/

BibTeX
@inproceedings{joulin2012icml-convex,
  title = {{A Convex Relaxation for Weakly Supervised Classifiers}},
  author = {Joulin, Armand and Bach, Francis R.},
  booktitle = {International Conference on Machine Learning},
  year = {2012},
  url = {https://mlanthology.org/icml/2012/joulin2012icml-convex/}
}