Squared-Loss Mutual Information Regularization: A Novel Information-Theoretic Approach to Semi-Supervised Learning

Abstract

We propose squared-loss mutual information regularization (SMIR) for multi-class probabilistic classification, following the information maximization principle. SMIR is convex under mild conditions and thus remedies the nonconvexity of mutual information regularization. It offers all four of the following abilities to semi-supervised algorithms: an analytical solution, out-of-sample classification, multi-class classification, and probabilistic output. Furthermore, novel generalization error bounds are derived. Experiments show that SMIR compares favorably with state-of-the-art methods.
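For context, the "squared loss" in SMIR refers to squared-loss mutual information (SMI), which replaces the Kullback-Leibler divergence in ordinary mutual information with the Pearson (squared-loss) divergence between the joint density and the product of the marginals. A standard form, following the authors' earlier work on SMI estimation (included here as background, not quoted from this paper), is:

\[
  \mathrm{SMI}(X, Y)
  = \frac{1}{2} \sum_{y \in \mathcal{Y}} \int p(x)\,p(y)
    \left( \frac{p(x, y)}{p(x)\,p(y)} - 1 \right)^{2} \mathrm{d}x ,
\]

which, like ordinary mutual information \( \mathrm{MI}(X, Y) = \sum_{y \in \mathcal{Y}} \int p(x, y) \log \frac{p(x, y)}{p(x)\,p(y)} \,\mathrm{d}x \), is zero if and only if \(X\) and \(Y\) are statistically independent. Its quadratic form is what makes the regularized objective convex under the paper's mild conditions and admits an analytical solution.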

Cite

Text

Niu et al. "Squared-Loss Mutual Information Regularization: A Novel Information-Theoretic Approach to Semi-Supervised Learning." International Conference on Machine Learning, 2013.

Markdown

[Niu et al. "Squared-Loss Mutual Information Regularization: A Novel Information-Theoretic Approach to Semi-Supervised Learning." International Conference on Machine Learning, 2013.](https://mlanthology.org/icml/2013/niu2013icml-squaredloss/)

BibTeX

@inproceedings{niu2013icml-squaredloss,
  title     = {{Squared-Loss Mutual Information Regularization: A Novel Information-Theoretic Approach to Semi-Supervised Learning}},
  author    = {Niu, Gang and Jitkrittum, Wittawat and Dai, Bo and Hachiya, Hirotaka and Sugiyama, Masashi},
  booktitle = {International Conference on Machine Learning},
  year      = {2013},
  pages     = {10--18},
  volume    = {28},
  url       = {https://mlanthology.org/icml/2013/niu2013icml-squaredloss/}
}