Nonrigid Embeddings for Dimensionality Reduction

Abstract

Spectral methods for embedding graphs and immersing data manifolds in low-dimensional spaces are notoriously unstable due to insufficient and/or numerically ill-conditioned constraint sets. We show why this is endemic to spectral methods, and develop low-complexity solutions for stiffening ill-conditioned problems and regularizing ill-posed problems, with proofs of correctness. The regularization exploits sparse but complementary constraints on affine rigidity and edge lengths to obtain isometric embeddings. An implemented algorithm is fast, accurate, and industrial-strength: experiments with problem sizes spanning four orders of magnitude show O(N) scaling. We demonstrate with speech data.

Cite

Text

Brand. "Nonrigid Embeddings for Dimensionality Reduction." European Conference on Machine Learning, 2005. doi:10.1007/11564096_10

Markdown

[Brand. "Nonrigid Embeddings for Dimensionality Reduction." European Conference on Machine Learning, 2005.](https://mlanthology.org/ecmlpkdd/2005/brand2005ecml-nonrigid/) doi:10.1007/11564096_10

BibTeX

@inproceedings{brand2005ecml-nonrigid,
  title     = {{Nonrigid Embeddings for Dimensionality Reduction}},
  author    = {Brand, Matthew},
  booktitle = {European Conference on Machine Learning},
  year      = {2005},
  pages     = {47--59},
  doi       = {10.1007/11564096_10},
  url       = {https://mlanthology.org/ecmlpkdd/2005/brand2005ecml-nonrigid/}
}