An Introduction to Nonlinear Dimensionality Reduction by Maximum Variance Unfolding
Abstract
Many problems in AI are simplified by clever representations of sensory or symbolic input. How to discover such representations automatically, from large amounts of unlabeled data, remains a fundamental challenge. The goal of statistical methods for dimensionality reduction is to detect and discover low dimensional structure in high dimensional data. In this paper, we review a recently proposed algorithm, maximum variance unfolding, for learning faithful low dimensional representations of high dimensional data. The algorithm relies on modern tools in convex optimization that are proving increasingly useful in many areas of machine learning.
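The abstract's reference to "modern tools in convex optimization" is the semidefinite program at the heart of the algorithm. As a brief sketch of the standard maximum variance unfolding formulation (here $K$ is the Gram matrix of the outputs, $x_i$ the inputs, and $(i,j) \in \mathcal{N}$ indexes pairs of $k$-nearest neighbors):

$$
\begin{aligned}
\max_{K} \quad & \operatorname{tr}(K) \\
\text{s.t.} \quad & K \succeq 0, \\
& \textstyle\sum_{ij} K_{ij} = 0, \\
& K_{ii} - 2K_{ij} + K_{jj} = \lVert x_i - x_j \rVert^2 \quad \text{for all } (i,j) \in \mathcal{N}.
\end{aligned}
$$

Maximizing the trace "unfolds" the data by pulling non-neighboring points apart while the constraints preserve local distances and center the output; the low dimensional embedding is then read off from the top eigenvectors of the optimal $K$.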
Cite
Text
Weinberger and Saul. "An Introduction to Nonlinear Dimensionality Reduction by Maximum Variance Unfolding." AAAI Conference on Artificial Intelligence, 2006.
Markdown
[Weinberger and Saul. "An Introduction to Nonlinear Dimensionality Reduction by Maximum Variance Unfolding." AAAI Conference on Artificial Intelligence, 2006.](https://mlanthology.org/aaai/2006/weinberger2006aaai-introduction/)
BibTeX
@inproceedings{weinberger2006aaai-introduction,
title = {{An Introduction to Nonlinear Dimensionality Reduction by Maximum Variance Unfolding}},
author = {Weinberger, Kilian Q. and Saul, Lawrence K.},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2006},
pages = {1683-1686},
url = {https://mlanthology.org/aaai/2006/weinberger2006aaai-introduction/}
}