Continuous Nonlinear Dimensionality Reduction by Kernel Eigenmaps

Abstract

We equate nonlinear dimensionality reduction (NLDR) to graph embedding with side information about the vertices, and derive a solution to either problem in the form of a kernel-based mixture of affine maps from the ambient space to the target space. Unlike most spectral NLDR methods, the central eigenproblem can be made relatively small, and the result is a continuous mapping defined over the entire space, not just the datapoints. We demonstrate by visualizing the distribution of word usages (as a proxy for word meanings) in a sample of the machine learning literature.

1 Background: Graph embeddings

Consider a connected graph with weighted undirected edges specified by edge matrix W. Let W_ij be the positive edge weight between connected vertices i and j, and zero otherwise. Let D = diag(W1) be the diagonal matrix whose entry D_ii is the cumulative edge weight into vertex i. The following points are well known or easily derived in spectral graph theory.
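As a concrete illustration of this setup, the sketch below builds the degree matrix D = diag(W1) from an edge matrix W and computes a standard spectral graph embedding by solving the generalized eigenproblem (D - W)v = λDv (Laplacian eigenmaps). This is only the background construction the paper builds on, not Brand's kernel eigenmaps method itself; the function name spectral_embedding and the toy graph are my own.

import numpy as np
from scipy.linalg import eigh

def spectral_embedding(W, dim=2):
    """Embed graph vertices given a symmetric nonnegative edge matrix W.

    Solves the generalized eigenproblem (D - W) v = lambda D v with
    D = diag(W 1), then uses the eigenvectors of the smallest nonzero
    eigenvalues as coordinates (the Laplacian eigenmaps embedding).
    """
    d = W.sum(axis=1)            # cumulative edge weight into each vertex
    D = np.diag(d)               # degree matrix D = diag(W 1)
    L = D - W                    # unnormalized graph Laplacian
    # Symmetric generalized eigenproblem; eigh returns ascending eigenvalues.
    vals, vecs = eigh(L, D)
    # Skip the trivial constant eigenvector (eigenvalue 0 for a connected graph).
    return vecs[:, 1:dim + 1]

# Toy usage: a 4-vertex path graph; the 1-D embedding orders the vertices
# along the path.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(spectral_embedding(W, dim=1))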

Cite

Text

Brand. "Continuous Nonlinear Dimensionality Reduction by Kernel Eigenmaps." International Joint Conference on Artificial Intelligence, 2003.

Markdown

[Brand. "Continuous Nonlinear Dimensionality Reduction by Kernel Eigenmaps." International Joint Conference on Artificial Intelligence, 2003.](https://mlanthology.org/ijcai/2003/brand2003ijcai-continuous/)

BibTeX

@inproceedings{brand2003ijcai-continuous,
  title     = {{Continuous Nonlinear Dimensionality Reduction by Kernel Eigenmaps}},
  author    = {Brand, Matthew},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2003},
  pages     = {547--554},
  url       = {https://mlanthology.org/ijcai/2003/brand2003ijcai-continuous/}
}