Semi-Supervised Learning by Entropy Minimization

Abstract

We consider the semi-supervised learning problem, where a decision rule is to be learned from labeled and unlabeled data. In this framework, we motivate minimum entropy regularization, which enables unlabeled data to be incorporated into standard supervised learning. Our approach includes other approaches to the semi-supervised problem as particular or limiting cases. A series of experiments illustrates that the proposed solution benefits from unlabeled data. The method challenges mixture models when the data are sampled from the distribution class spanned by the generative model. The performances are definitely in favor of minimum entropy regularization when generative models are misspecified, and the weighting of unlabeled data provides robustness to the violation of the "cluster assumption". Finally, we also illustrate that the method can be far superior to manifold learning in high-dimensional spaces.
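To make the criterion concrete, here is a minimal NumPy sketch of a loss of the kind the abstract describes: standard cross-entropy on the labeled points plus a weighted entropy penalty on the model's predictions for the unlabeled points. The function names, the weighting parameter `lam`, and the toy data are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with a max-shift for numerical stability."""
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def semi_supervised_loss(W, X_l, y_l, X_u, lam=0.5):
    """Labeled cross-entropy plus lam times the mean prediction
    entropy on the unlabeled points (a minimum entropy regularizer).
    All argument names here are illustrative."""
    eps = 1e-12
    p_l = softmax(X_l @ W)
    ce = -np.mean(np.log(p_l[np.arange(len(y_l)), y_l] + eps))
    p_u = softmax(X_u @ W)
    ent = -np.mean((p_u * np.log(p_u + eps)).sum(axis=1))
    return ce + lam * ent

# Toy example: 2 labeled and 3 unlabeled points in 2-D, 2 classes.
rng = np.random.default_rng(0)
X_l = rng.normal(size=(2, 2))
y_l = np.array([0, 1])
X_u = rng.normal(size=(3, 2))
W = np.zeros((2, 2))  # zero weights give uniform predictions,
                      # so the loss equals (1 + lam) * log 2
print(semi_supervised_loss(W, X_l, y_l, X_u))
```

Minimizing this objective pushes the classifier toward confident (low-entropy) predictions on unlabeled data, which is what drives the decision boundary into low-density regions under the cluster assumption.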

Cite

Text

Grandvalet and Bengio. "Semi-Supervised Learning by Entropy Minimization." Neural Information Processing Systems, 2004.

Markdown

[Grandvalet and Bengio. "Semi-Supervised Learning by Entropy Minimization." Neural Information Processing Systems, 2004.](https://mlanthology.org/neurips/2004/grandvalet2004neurips-semisupervised/)

BibTeX

@inproceedings{grandvalet2004neurips-semisupervised,
  title     = {{Semi-Supervised Learning by Entropy Minimization}},
  author    = {Grandvalet, Yves and Bengio, Yoshua},
  booktitle = {Neural Information Processing Systems},
  year      = {2004},
  pages     = {529-536},
  url       = {https://mlanthology.org/neurips/2004/grandvalet2004neurips-semisupervised/}
}