Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering
Abstract
Several unsupervised learning algorithms based on an eigendecomposition provide either an embedding or a clustering only for given training points, with no straightforward extension for out-of-sample examples short of recomputing eigenvectors. This paper provides a unified framework for extending Local Linear Embedding (LLE), Isomap, Laplacian Eigenmaps, Multi-Dimensional Scaling (for dimensionality reduction) as well as for Spectral Clustering. This framework is based on seeing these algorithms as learning eigenfunctions of a data-dependent kernel. Numerical experiments show that the generalizations performed have a level of error comparable to the variability of the embedding algorithms due to the choice of training data.
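As a rough illustration of the kernel-eigenfunction view described in the abstract, the sketch below uses a Nyström-style formula to place new points in an existing embedding without recomputing eigenvectors. A plain Gaussian kernel stands in for the algorithm-specific data-dependent kernels discussed in the paper, and all function names, parameters, and normalization choices here are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def rbf_kernel(a, b, sigma=1.0):
    # Gaussian kernel used as a generic stand-in; each algorithm in the paper
    # defines its own data-dependent kernel (geodesic, LLE-based, normalized, ...).
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def fit_embedding(X, dim=2, sigma=1.0):
    # Eigendecompose the training kernel matrix and keep the `dim` leading eigenpairs.
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    eigvals, eigvecs = np.linalg.eigh(K)          # ascending order
    idx = np.argsort(eigvals)[::-1][:dim]
    lam, V = eigvals[idx], eigvecs[:, idx]
    # Training-point coordinates up to algorithm-specific scaling
    # (e.g. an extra sqrt(lambda_k) factor for Isomap/MDS is omitted here).
    train_embedding = np.sqrt(n) * V
    return lam, V, train_embedding

def extend(x_new, X_train, lam, V, sigma=1.0):
    # Nystrom-style out-of-sample coordinate:
    #   f_k(x) = sqrt(n) / lambda_k * sum_i v_ik * K(x, x_i)
    # Assumes the retained eigenvalues are well away from zero.
    n = X_train.shape[0]
    k_x = rbf_kernel(np.atleast_2d(x_new), X_train, sigma)   # shape (m, n)
    return np.sqrt(n) * (k_x @ V) / lam

# Usage: embed training data once, then place held-out points directly.
rng = np.random.RandomState(0)
X = rng.randn(100, 3)
lam, V, emb = fit_embedding(X, dim=2, sigma=1.5)
new_coords = extend(rng.randn(5, 3), X, lam, V, sigma=1.5)
```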
Cite
Text
Bengio et al. "Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering." Neural Information Processing Systems, 2003.
Markdown
[Bengio et al. "Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering." Neural Information Processing Systems, 2003.](https://mlanthology.org/neurips/2003/bengio2003neurips-outofsample/)
BibTeX
@inproceedings{bengio2003neurips-outofsample,
title = {{Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering}},
author = {Bengio, Yoshua and Paiement, Jean-François and Vincent, Pascal and Delalleau, Olivier and Roux, Nicolas L. and Ouimet, Marie},
booktitle = {Neural Information Processing Systems},
year = {2003},
pages = {177-184},
url = {https://mlanthology.org/neurips/2003/bengio2003neurips-outofsample/}
}