An Iterative Locally Linear Embedding Algorithm

Abstract

Locally Linear Embedding (LLE) is a popular dimension reduction method. In this paper, we systematically improve the two main steps of LLE: (A) learning the graph weights W, and (B) learning the embedding Y. We propose a sparse nonnegative W learning algorithm. We propose a weighted formulation for learning Y and show that the results are identical to those of normalized-cut spectral clustering. We further propose to iterate the two steps of LLE repeatedly to improve the results. Extensive experimental results show that the iterative LLE algorithm significantly improves both classification and clustering results.
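The abstract describes an alternation between two steps: (A) learning sparse nonnegative reconstruction weights W, and (B) computing an embedding Y from a normalized-cut-style spectral problem. The sketch below is only a rough illustration of that structure under stated assumptions, not the authors' exact formulation: the weight step uses plain nonnegative least squares over k nearest neighbors, the embedding step uses the symmetrically normalized graph Laplacian, and the coupling between iterations (recomputing weights in the current embedding) is an assumed simplification. All names and parameters (nonnegative_weights, normalized_cut_embedding, iterative_lle, k, d, n_iters) are illustrative.

```python
# Illustrative sketch of an iterative LLE-style pipeline (assumptions noted above;
# not the paper's exact algorithm).
import numpy as np
from scipy.optimize import nnls
from scipy.spatial.distance import cdist


def nonnegative_weights(X, k=10):
    """Step (A): nonnegative least-squares reconstruction weights over each
    point's k nearest neighbors; zeros elsewhere keep W sparse."""
    n = X.shape[0]
    W = np.zeros((n, n))
    dist = cdist(X, X)
    for i in range(n):
        nbrs = np.argsort(dist[i])[1:k + 1]      # skip the point itself
        w, _ = nnls(X[nbrs].T, X[i])             # min ||x_i - X_nbrs^T w||, w >= 0
        if w.sum() > 0:
            w /= w.sum()                         # normalize weights to sum to 1
        W[i, nbrs] = w
    return W


def normalized_cut_embedding(W, d=2):
    """Step (B): spectral embedding from the symmetrically normalized Laplacian
    of the symmetrized weight graph (normalized-cuts-style)."""
    S = 0.5 * (W + W.T)
    deg = np.maximum(S.sum(axis=1), 1e-12)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    L = np.eye(S.shape[0]) - D_inv_sqrt @ S @ D_inv_sqrt
    _, vecs = np.linalg.eigh(L)                  # eigenvalues in ascending order
    return vecs[:, 1:d + 1]                      # drop the trivial eigenvector


def iterative_lle(X, k=10, d=2, n_iters=5):
    """Alternate (A) and (B); here the embedding is simply fed back into the
    weight-learning step (an assumed coupling)."""
    Z = X
    Y = None
    for _ in range(n_iters):
        W = nonnegative_weights(Z, k=k)
        Y = normalized_cut_embedding(W, d=d)
        Z = Y
    return Y
```

A minimal usage example would be `Y = iterative_lle(X, k=10, d=2)` for a data matrix X of shape (n_samples, n_features); the number of iterations and the neighborhood size would in practice be tuned per dataset.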

Cite

Text

Kong et al. "An Iterative Locally Linear Embedding Algorithm." International Conference on Machine Learning, 2012.

Markdown

[Kong et al. "An Iterative Locally Linear Embedding Algorithm." International Conference on Machine Learning, 2012.](https://mlanthology.org/icml/2012/kong2012icml-iterative/)

BibTeX

@inproceedings{kong2012icml-iterative,
  title     = {{An Iterative Locally Linear Embedding Algorithm}},
  author    = {Kong, Deguang and Ding, Chris H. Q. and Huang, Heng and Nie, Feiping},
  booktitle = {International Conference on Machine Learning},
  year      = {2012},
  url       = {https://mlanthology.org/icml/2012/kong2012icml-iterative/}
}