Robust Non-Linear Dimensionality Reduction Using Successive 1-Dimensional Laplacian Eigenmaps

Abstract

Non-linear dimensionality reduction of noisy data is a challenging problem encountered in a variety of data analysis applications. Recent results in the literature show that spectral decomposition, as used for example by the Laplacian Eigenmaps algorithm, provides a powerful tool for non-linear dimensionality reduction and manifold learning. In this paper, we discuss a significant shortcoming of these approaches, which we refer to as the repeated eigendirections problem. We propose a novel approach that combines successive 1-dimensional spectral embeddings with a data advection scheme that allows us to address this problem. The proposed method does not depend on a non-linear optimization scheme; hence, it is not prone to local minima. Experiments with artificial and real data illustrate the advantages of the proposed method over existing approaches. We also demonstrate that the approach is capable of correctly learning manifolds corrupted by significant amounts of noise.
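For readers unfamiliar with the spectral embedding step the abstract builds on, here is a minimal NumPy sketch of a standard 1-dimensional Laplacian Eigenmap (the basic algorithm only; the paper's successive-embedding and advection scheme is not reproduced here, and the neighborhood size `k` and kernel width `sigma` are illustrative choices):

```python
import numpy as np

def laplacian_eigenmap_1d(X, k=5, sigma=1.0):
    """1-D Laplacian Eigenmap: kNN graph, heat-kernel weights, Fiedler vector."""
    n = len(X)
    # Pairwise squared Euclidean distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # kNN adjacency with Gaussian heat-kernel weights, symmetrized
    W = np.zeros((n, n))
    idx = np.argsort(d2, axis=1)[:, 1:k + 1]  # skip self (column 0)
    for i in range(n):
        W[i, idx[i]] = np.exp(-d2[i, idx[i]] / (2 * sigma ** 2))
    W = np.maximum(W, W.T)
    deg = W.sum(1)
    L = np.diag(deg) - W
    # Solve the generalized problem L y = lambda D y via symmetric normalization
    Dinv_sqrt = np.diag(1.0 / np.sqrt(deg))
    Lsym = Dinv_sqrt @ L @ Dinv_sqrt
    vals, vecs = np.linalg.eigh(Lsym)
    # Second-smallest eigenvector (the first is the trivial constant mode)
    return Dinv_sqrt @ vecs[:, 1]

# Noisy points along a circular arc; the 1-D embedding should recover
# the ordering of points along the curve.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 3.0, 40)
X = np.c_[np.cos(t), np.sin(t)] + 0.01 * rng.standard_normal((40, 2))
y = laplacian_eigenmap_1d(X)
order = np.argsort(y)  # indices sorted along the recovered 1-D coordinate
```

On this near-1-D arc the embedding coordinate `y` is (up to sign) monotone in the arc parameter `t`, which is exactly the behavior the successive 1-dimensional embeddings in the paper exploit.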

Cite

Text

Gerber et al. "Robust Non-Linear Dimensionality Reduction Using Successive 1-Dimensional Laplacian Eigenmaps." International Conference on Machine Learning, 2007. doi:10.1145/1273496.1273532

Markdown

[Gerber et al. "Robust Non-Linear Dimensionality Reduction Using Successive 1-Dimensional Laplacian Eigenmaps." International Conference on Machine Learning, 2007.](https://mlanthology.org/icml/2007/gerber2007icml-robust/) doi:10.1145/1273496.1273532

BibTeX

@inproceedings{gerber2007icml-robust,
  title     = {{Robust Non-Linear Dimensionality Reduction Using Successive 1-Dimensional Laplacian Eigenmaps}},
  author    = {Gerber, Samuel and Tasdizen, Tolga and Whitaker, Ross T.},
  booktitle = {International Conference on Machine Learning},
  year      = {2007},
  pages     = {281--288},
  doi       = {10.1145/1273496.1273532},
  url       = {https://mlanthology.org/icml/2007/gerber2007icml-robust/}
}