On Manifold Regularization
Abstract
We propose a family of learning algorithms based on a new form of regularization that allows us to exploit the geometry of the marginal distribution. We focus on a semi-supervised framework that incorporates labeled and unlabeled data in a general-purpose learner. Some transductive graph learning algorithms and standard methods including Support Vector Machines and Regularized Least Squares can be obtained as special cases. We utilize properties of Reproducing Kernel Hilbert spaces to prove new Representer theorems that provide a theoretical basis for the algorithms. As a result (in contrast to purely graph-based approaches) we obtain a natural out-of-sample extension to novel examples and are thus able to handle both transductive and truly semi-supervised settings. We present experimental evidence suggesting that our semi-supervised algorithms are able to use unlabeled data effectively. In the absence of labeled examples, our framework gives rise to a regularized form of spectral clustering with an out-of-sample extension.
Cite

Belkin et al. "On Manifold Regularization." Proceedings of the Tenth International Workshop on Artificial Intelligence and Statistics, 2005.
https://mlanthology.org/aistats/2005/belkin2005aistats-manifold/

BibTeX
@inproceedings{belkin2005aistats-manifold,
title = {{On Manifold Regularization}},
author = {Belkin, Misha and Niyogi, Partha and Sindhwani, Vikas},
booktitle = {Proceedings of the Tenth International Workshop on Artificial Intelligence and Statistics},
year = {2005},
pages = {17--24},
volume = {R5},
url = {https://mlanthology.org/aistats/2005/belkin2005aistats-manifold/}
}
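To make the abstract's family of algorithms concrete, here is a minimal NumPy sketch of Laplacian Regularized Least Squares (LapRLS), one member of that family: a kernel least-squares fit on the labeled points plus a graph-Laplacian penalty built from all points, with the Representer-theorem expansion supplying the out-of-sample extension. The dense RBF adjacency, hyperparameter values, and function names are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def laprls_fit(X, y_labeled, gamma_A=1e-2, gamma_I=1e-2, k_rbf=1.0):
    """Sketch of Laplacian Regularized Least Squares (LapRLS).

    X: all n = l + u points, labeled points first; y_labeled: l labels in {-1, +1}.
    Returns expansion coefficients alpha for f(x) = sum_i alpha_i K(x_i, x).
    """
    n, l = X.shape[0], y_labeled.shape[0]
    K = rbf_kernel(X, X, k_rbf)

    # Unnormalized graph Laplacian L = D - W; here the adjacency W simply
    # reuses the RBF kernel (a simplification for the sketch).
    W = rbf_kernel(X, X, k_rbf)
    L = np.diag(W.sum(axis=1)) - W

    # J selects the labeled points; Y pads the labels with zeros for
    # the unlabeled points.
    J = np.zeros((n, n))
    J[:l, :l] = np.eye(l)
    Y = np.zeros(n)
    Y[:l] = y_labeled

    # Closed-form solution of the LapRLS linear system:
    # alpha = (J K + gamma_A * l * I + gamma_I * l / n^2 * L K)^{-1} Y
    M = J @ K + gamma_A * l * np.eye(n) + (gamma_I * l / n**2) * (L @ K)
    return np.linalg.solve(M, Y)

def laprls_predict(alpha, X_train, X_new, k_rbf=1.0):
    # Out-of-sample evaluation via the kernel expansion: unlike purely
    # graph-based methods, f extends naturally to points outside the graph.
    return rbf_kernel(X_new, X_train, k_rbf) @ alpha
```

Setting `gamma_I = 0` recovers plain (fully supervised) Regularized Least Squares on the labeled points, which is the "special case" structure the abstract refers to; the unlabeled points then contribute nothing to the solution.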