The Elastic Embedding Algorithm for Dimensionality Reduction
Abstract
We propose a new dimensionality reduction method, the elastic embedding (EE), that optimises an intuitive, nonlinear objective function of the low-dimensional coordinates of the data. The method reveals a fundamental relation between a spectral method, Laplacian eigenmaps, and a nonlinear method, stochastic neighbour embedding; and shows that EE can be seen as learning both the coordinates and the affinities between data points. We give a homotopy method to train EE, characterise the critical value of the homotopy parameter, and study the method's behaviour. For a fixed homotopy parameter, we give a globally convergent iterative algorithm that is very effective and requires no user parameters. Finally, we give an extension to out-of-sample points. In standard datasets, EE obtains results as good as or better than those of SNE, but more efficiently and robustly.
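The EE objective combines an attractive term of Laplacian-eigenmaps form with a repulsive term of SNE form, weighted by the homotopy parameter λ: E(X; λ) = Σ_nm w⁺_nm‖x_n − x_m‖² + λ Σ_nm w⁻_nm exp(−‖x_n − x_m‖²). A minimal sketch of evaluating this objective is shown below; the attractive weights `Wp` and repulsive weights `Wn` are assumed to be precomputed nonnegative affinity matrices (e.g. Gaussian affinities for `Wp`), and the function names are illustrative, not from the paper's reference code.

```python
import numpy as np

def elastic_embedding_objective(X, Wp, Wn, lam):
    """Evaluate the EE objective E(X; lambda) for low-dim coordinates X (N x d).

    Wp: attractive (Laplacian-eigenmaps-like) affinities, N x N, nonnegative.
    Wn: repulsive (SNE-like) affinities, N x N, nonnegative.
    lam: homotopy parameter trading off attraction vs. repulsion.
    """
    # Pairwise squared Euclidean distances between all low-dim points.
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    attraction = (Wp * D2).sum()            # pulls neighbours together
    repulsion = (Wn * np.exp(-D2)).sum()    # pushes all point pairs apart
    return attraction + lam * repulsion
```

At λ = 0 only the attractive term remains and the objective reduces to the (unnormalised) Laplacian-eigenmaps quadratic form; increasing λ along the homotopy path gradually turns on the repulsion.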
Cite
Text
Carreira-Perpiñán. "The Elastic Embedding Algorithm for Dimensionality Reduction." International Conference on Machine Learning, 2010.
Markdown
[Carreira-Perpiñán. "The Elastic Embedding Algorithm for Dimensionality Reduction." International Conference on Machine Learning, 2010.](https://mlanthology.org/icml/2010/carreiraperpinan2010icml-elastic/)
BibTeX
@inproceedings{carreiraperpinan2010icml-elastic,
  title = {{The Elastic Embedding Algorithm for Dimensionality Reduction}},
  author = {Carreira-Perpiñán, Miguel Á.},
  booktitle = {International Conference on Machine Learning},
  year = {2010},
  pages = {167--174},
  url = {https://mlanthology.org/icml/2010/carreiraperpinan2010icml-elastic/}
}