Minimax Embeddings

Abstract

Spectral methods for nonlinear dimensionality reduction (NLDR) impose a neighborhood graph on point data and compute eigenfunctions of a quadratic form generated from the graph. We introduce a more general and more robust formulation of NLDR based on the singular value decomposition (SVD). In this framework, most spectral NLDR principles can be recovered by taking a subset of the constraints in a quadratic form built from local nullspaces on the manifold. The minimax formulation also opens up an interesting class of methods in which the graph is “decorated” with information at the vertices, offering discrete or continuous maps, reduced computational complexity, and immunity to some solution instabilities of eigenfunction approaches. Apropos, we show almost all NLDR methods based on eigenvalue decompositions (EVD) have a solution instability that increases faster than problem size. This pathology can be observed (and corrected via the minimax formulation) in problems as small as N < 100 points.
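For readers unfamiliar with the baseline the abstract refers to, the following is a minimal illustrative sketch of the standard spectral NLDR pipeline (a Laplacian-eigenmaps-style embedding), not the paper's minimax/SVD formulation: a k-nearest-neighbor graph is imposed on the points, a quadratic form (the graph Laplacian) is built from it, and its low eigenvectors give the embedding. The data, neighborhood size, and output dimension are placeholder assumptions.

```python
# Sketch of spectral NLDR via a graph-Laplacian quadratic form
# (Laplacian-eigenmaps style); NOT the paper's minimax/SVD method.
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def spectral_embedding(X, k=10, d=2):
    """Embed N points (rows of X) into d dimensions via a k-NN graph Laplacian."""
    N = X.shape[0]
    D = cdist(X, X)                          # pairwise distances
    W = np.zeros((N, N))
    for i in range(N):
        nbrs = np.argsort(D[i])[1:k + 1]     # k nearest neighbors, excluding self
        W[i, nbrs] = np.exp(-D[i, nbrs] ** 2)  # Gaussian edge weights
    W = np.maximum(W, W.T)                   # symmetrize the neighborhood graph
    L = np.diag(W.sum(axis=1)) - W           # quadratic form: graph Laplacian
    # Eigenfunctions of the quadratic form; skip the trivial constant eigenvector.
    vals, vecs = eigh(L)
    return vecs[:, 1:d + 1]

# Usage example: a noisy circle in 3-D mapped to 2-D coordinates.
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
X = np.c_[np.cos(t), np.sin(t), 0.1 * np.random.randn(200)]
Y = spectral_embedding(X, k=8, d=2)
```

It is this eigenvector step that the abstract identifies as prone to solution instability on small problems, and that the paper's minimax/SVD formulation is designed to generalize and stabilize.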

Cite

Text

Brand. "Minimax Embeddings." Neural Information Processing Systems, 2003.

Markdown

[Brand. "Minimax Embeddings." Neural Information Processing Systems, 2003.](https://mlanthology.org/neurips/2003/brand2003neurips-minimax/)

BibTeX

@inproceedings{brand2003neurips-minimax,
  title     = {{Minimax Embeddings}},
  author    = {Brand, Matthew},
  booktitle = {Neural Information Processing Systems},
  year      = {2003},
  pages     = {505-512},
  url       = {https://mlanthology.org/neurips/2003/brand2003neurips-minimax/}
}