Analysis and Extension of Spectral Methods for Nonlinear Dimensionality Reduction

Abstract

Many unsupervised algorithms for nonlinear dimensionality reduction, such as locally linear embedding (LLE) and Laplacian eigenmaps, are derived from the spectral decompositions of sparse matrices. While these algorithms aim to preserve certain proximity relations on average, their embeddings are not explicitly designed to preserve local features such as distances or angles. In this paper, we show how to construct a low-dimensional embedding that maximally preserves angles between nearby data points. The embedding is derived from the bottom eigenvectors of LLE and/or Laplacian eigenmaps by solving an additional (but small) problem in semidefinite programming, whose size is independent of the number of data points. The solution obtained by semidefinite programming also yields an estimate of the data's intrinsic dimensionality. Experimental results on several data sets demonstrate the merits of our approach.
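The abstract's starting point, embedding data via the bottom eigenvectors of a sparse graph Laplacian, can be illustrated with a minimal sketch. This is not the paper's angle-preserving SDP extension, only the baseline Laplacian eigenmaps step it builds on; it assumes NumPy, binary k-nearest-neighbor weights, and the unnormalized Laplacian L = D - W:

```python
import numpy as np

def laplacian_eigenmaps(X, n_neighbors=5, n_components=2):
    """Embed X via the bottom (smallest-eigenvalue) eigenvectors of the
    graph Laplacian of a k-nearest-neighbor graph (illustrative sketch)."""
    n = X.shape[0]
    # Pairwise squared Euclidean distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # Symmetric kNN adjacency with binary weights (for simplicity)
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:n_neighbors + 1]  # skip self at position 0
        W[i, idx] = 1.0
    W = np.maximum(W, W.T)
    # Unnormalized graph Laplacian L = D - W
    L = np.diag(W.sum(axis=1)) - W
    # eigh returns eigenvalues in ascending order; drop the constant
    # eigenvector (eigenvalue ~0) and keep the next n_components
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, 1:n_components + 1]

# Toy example: a noisy circle in 3-D mapped down to 2-D
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 60, endpoint=False)
X = np.c_[np.cos(t), np.sin(t), 0.05 * rng.standard_normal(60)]
Y = laplacian_eigenmaps(X, n_neighbors=4, n_components=2)
print(Y.shape)  # (60, 2)
```

The paper's contribution operates downstream of this step: it post-processes several such bottom eigenvectors by solving a small semidefinite program, rather than using them directly as the embedding.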

Cite

Text

Sha and Saul. "Analysis and Extension of Spectral Methods for Nonlinear Dimensionality Reduction." International Conference on Machine Learning, 2005. doi:10.1145/1102351.1102450

Markdown

[Sha and Saul. "Analysis and Extension of Spectral Methods for Nonlinear Dimensionality Reduction." International Conference on Machine Learning, 2005.](https://mlanthology.org/icml/2005/sha2005icml-analysis/) doi:10.1145/1102351.1102450

BibTeX

@inproceedings{sha2005icml-analysis,
  title     = {{Analysis and Extension of Spectral Methods for Nonlinear Dimensionality Reduction}},
  author    = {Sha, Fei and Saul, Lawrence K.},
  booktitle = {International Conference on Machine Learning},
  year      = {2005},
  pages     = {784--791},
  doi       = {10.1145/1102351.1102450},
  url       = {https://mlanthology.org/icml/2005/sha2005icml-analysis/}
}