Semi-Supervised Learning via Compact Latent Space Clustering

Abstract

We present a novel cost function for semi-supervised learning of neural networks that encourages compact clustering of the latent space to facilitate separation. The key idea is to dynamically create a graph over embeddings of labeled and unlabeled samples of a training batch to capture underlying structure in feature space, and use label propagation to estimate its high- and low-density regions. We then devise a cost function based on Markov chains on the graph that regularizes the latent space to form a single compact cluster per class, while avoiding disturbing existing clusters during optimization. We evaluate our approach on three benchmarks and compare to the state of the art with promising results. Our approach combines the benefits of graph-based regularization with efficient, inductive inference, does not require modifications to the network architecture, and can thus be easily applied to existing networks to enable effective use of unlabeled data.
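The abstract's core mechanism, building a similarity graph over the embeddings of a batch and propagating labels over it to estimate class regions, can be illustrated with a minimal sketch. This is a generic label-propagation routine, not the paper's exact cost function; the Gaussian affinity, the clamping of labeled nodes, and all parameter names (`sigma`, `n_iters`) are illustrative assumptions.

```python
import numpy as np

def label_propagation_batch(z, y, n_classes, sigma=1.0, n_iters=10):
    """Propagate labels over a dense similarity graph built from the
    embeddings z (N, D) of one training batch. y holds class indices
    for labeled samples and -1 for unlabeled ones. Returns soft class
    posteriors of shape (N, n_classes).

    Illustrative sketch only; the paper's method additionally derives a
    clustering cost from Markov chains on this graph."""
    # Gaussian affinity between all pairs of embeddings in the batch.
    d2 = ((z[:, None, :] - z[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(w, 0.0)
    # Row-normalize to obtain a Markov transition matrix over the graph.
    t = w / w.sum(axis=1, keepdims=True)

    labeled = y >= 0
    # Unlabeled nodes start with a uniform class distribution.
    phi = np.full((len(y), n_classes), 1.0 / n_classes)
    phi[labeled] = np.eye(n_classes)[y[labeled]]
    for _ in range(n_iters):
        phi = t @ phi
        # Clamp labeled nodes back to their one-hot targets each step.
        phi[labeled] = np.eye(n_classes)[y[labeled]]
    return phi
```

On a toy batch with two well-separated groups, the unlabeled points inherit the class of their nearby labeled neighbor, which is the density-based separation the abstract appeals to.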

Cite

Text

Kamnitsas et al. "Semi-Supervised Learning via Compact Latent Space Clustering." International Conference on Machine Learning, 2018.

Markdown

[Kamnitsas et al. "Semi-Supervised Learning via Compact Latent Space Clustering." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/kamnitsas2018icml-semisupervised/)

BibTeX

@inproceedings{kamnitsas2018icml-semisupervised,
  title     = {{Semi-Supervised Learning via Compact Latent Space Clustering}},
  author    = {Kamnitsas, Konstantinos and Castro, Daniel and Le Folgoc, Loic and Walker, Ian and Tanno, Ryutaro and Rueckert, Daniel and Glocker, Ben and Criminisi, Antonio and Nori, Aditya},
  booktitle = {International Conference on Machine Learning},
  year      = {2018},
  pages     = {2459--2468},
  volume    = {80},
  url       = {https://mlanthology.org/icml/2018/kamnitsas2018icml-semisupervised/}
}