Learning a Kernel Matrix for Nonlinear Dimensionality Reduction
Abstract
We investigate how to learn a kernel matrix for high dimensional data that lies on or near a low dimensional manifold. Noting that the kernel matrix implicitly maps the data into a nonlinear feature space, we show how to discover a mapping that "unfolds" the underlying manifold from which the data was sampled. The kernel matrix is constructed by maximizing the variance in feature space subject to local constraints that preserve the angles and distances between nearest neighbors. The main optimization involves an instance of semidefinite programming---a fundamentally different computation than previous algorithms for manifold learning, such as Isomap and locally linear embedding. The optimized kernels perform better than polynomial and Gaussian kernels for problems in manifold learning, but worse for problems in large margin classification. We explain these results in terms of the geometric properties of different kernels and comment on various interpretations of other manifold learning algorithms as kernel methods.
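The abstract describes the optimization only in prose: the learned kernel matrix K maximizes the variance in feature space (the trace of K) subject to centering and to constraints that preserve distances between nearest neighbors, and the low dimensional embedding is then read off from the top eigenvectors of K, as in kernel PCA. The sketch below illustrates only that final eigendecomposition step. The stand-in linear kernel, variable names, and data are ours for illustration, not the paper's; in the paper, K would come from the semidefinite program instead.

```python
import numpy as np

# Toy data; in the paper's setting X would lie on or near a manifold.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))

# Centering the data makes the kernel satisfy sum_ij K_ij = 0,
# the same centering constraint the paper imposes on the learned K.
Xc = X - X.mean(axis=0)
K = Xc @ Xc.T  # stand-in for the learned kernel matrix (assumption)

# Embedding: top eigenvectors of K, scaled by sqrt of the eigenvalues.
vals, vecs = np.linalg.eigh(K)          # eigh returns ascending order
order = np.argsort(vals)[::-1]          # sort descending
vals, vecs = vals[order], vecs[:, order]

d = 2                                   # target dimensionality
Y = vecs[:, :d] * np.sqrt(np.clip(vals[:d], 0.0, None))
```

Each row of `Y` is the d-dimensional coordinate of one input point; the quality of the embedding depends entirely on how well K encodes the manifold structure, which is what the paper's semidefinite program is designed to guarantee.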
Cite
Text
Weinberger et al. "Learning a Kernel Matrix for Nonlinear Dimensionality Reduction." International Conference on Machine Learning, 2004. doi:10.1145/1015330.1015345
Markdown
[Weinberger et al. "Learning a Kernel Matrix for Nonlinear Dimensionality Reduction." International Conference on Machine Learning, 2004.](https://mlanthology.org/icml/2004/weinberger2004icml-learning/) doi:10.1145/1015330.1015345
BibTeX
@inproceedings{weinberger2004icml-learning,
title = {{Learning a Kernel Matrix for Nonlinear Dimensionality Reduction}},
author = {Weinberger, Kilian Q. and Sha, Fei and Saul, Lawrence K.},
booktitle = {International Conference on Machine Learning},
year = {2004},
doi = {10.1145/1015330.1015345},
url = {https://mlanthology.org/icml/2004/weinberger2004icml-learning/}
}