Fast Training of Nonlinear Embedding Algorithms
Abstract
Stochastic neighbor embedding (SNE) and related nonlinear manifold learning algorithms achieve high-quality low-dimensional representations of similarity data, but are notoriously slow to train. We propose a generic formulation of embedding algorithms that includes SNE and other existing algorithms, and study their relation with spectral methods and graph Laplacians. This allows us to define several partial-Hessian optimization strategies, characterize their global and local convergence, and evaluate them empirically. We achieve up to two orders of magnitude speedup over existing training methods with a strategy (which we call the spectral direction) that adds nearly no overhead to the gradient and yet is simple, scalable and applicable to several existing and future embedding algorithms.
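The core idea behind the spectral direction can be illustrated with a minimal sketch: precondition the embedding gradient with the graph-Laplacian (attractive-term) part of the Hessian, which is positive semidefinite and can be Cholesky-factorized once and reused at every iteration. The affinity matrix, step size, regularizer, and the quadratic stand-in objective below are illustrative assumptions, not the paper's actual experimental setup.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(0)

# Toy symmetric affinity matrix W (assumed given by the embedding problem).
n, d = 20, 2
A = rng.random((n, n))
W = (A + A.T) / 2.0
np.fill_diagonal(W, 0.0)

# Graph Laplacian of the attractive term: L = D - W (positive semidefinite).
Lp = np.diag(W.sum(axis=1)) - W

# Factorize the regularized Laplacian ONCE; the factor is reused every step.
mu = 1e-3                                  # small shift for positive definiteness
factor = cho_factor(Lp + mu * np.eye(n))

# Stand-in objective: attractive term only, E(X) = tr(X^T L X).
def energy(X):
    return np.trace(X.T @ Lp @ X)

def gradient(X):
    return 2.0 * Lp @ X

X = rng.standard_normal((n, d))
e0 = energy(X)
for _ in range(10):
    G = gradient(X)
    P = -cho_solve(factor, G)              # spectral direction: preconditioned gradient
    X = X + 0.5 * P                        # fixed step; a line search is used in practice

e1 = energy(X)
print(e0, e1)                              # the objective decreases
```

Because the preconditioner is fixed, each iteration costs little more than a gradient evaluation plus two triangular solves, which is what makes the strategy scalable.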
Cite
Text
Vladymyrov and Carreira-Perpiñán. "Fast Training of Nonlinear Embedding Algorithms." International Conference on Machine Learning, 2012.
Markdown
[Vladymyrov and Carreira-Perpiñán. "Fast Training of Nonlinear Embedding Algorithms." International Conference on Machine Learning, 2012.](https://mlanthology.org/icml/2012/vladymyrov2012icml-fast/)
BibTeX
@inproceedings{vladymyrov2012icml-fast,
  title = {{Fast Training of Nonlinear Embedding Algorithms}},
  author = {Vladymyrov, Max and Carreira-Perpiñán, Miguel Á.},
  booktitle = {International Conference on Machine Learning},
  year = {2012},
  url = {https://mlanthology.org/icml/2012/vladymyrov2012icml-fast/}
}