Semi-Supervised Nonlinear Dimensionality Reduction

Abstract

We consider the problem of nonlinear dimensionality reduction, focusing on the semi-supervised setting in which prior information is available. We show that basic nonlinear dimensionality reduction algorithms, such as Locally Linear Embedding (LLE), Isometric feature mapping (ISOMAP), and Local Tangent Space Alignment (LTSA), can be modified to take into account prior information on the exact mapping of certain data points. A sensitivity analysis of our algorithms shows that prior information improves the stability of the solution. We also give some insight into what kind of prior information best improves the solution, and we demonstrate the usefulness of our algorithms on synthetic and real-life examples.

Cite

Text

Yang et al. "Semi-Supervised Nonlinear Dimensionality Reduction." International Conference on Machine Learning, 2006. doi:10.1145/1143844.1143978

Markdown

[Yang et al. "Semi-Supervised Nonlinear Dimensionality Reduction." International Conference on Machine Learning, 2006.](https://mlanthology.org/icml/2006/yang2006icml-semi/) doi:10.1145/1143844.1143978

BibTeX

@inproceedings{yang2006icml-semi,
  title     = {{Semi-Supervised Nonlinear Dimensionality Reduction}},
  author    = {Yang, Xin and Fu, Haoying and Zha, Hongyuan and Barlow, Jesse L.},
  booktitle = {International Conference on Machine Learning},
  year      = {2006},
  pages     = {1065--1072},
  doi       = {10.1145/1143844.1143978},
  url       = {https://mlanthology.org/icml/2006/yang2006icml-semi/}
}