Kernel PCA and De-Noising in Feature Spaces

Abstract

Kernel PCA as a nonlinear feature extractor has proven powerful as a preprocessing step for classification algorithms. But it can also be considered as a natural generalization of linear principal component analysis. This gives rise to the question of how to use nonlinear features for data compression, reconstruction, and de-noising, applications common in linear PCA. This is a nontrivial task, as the results provided by kernel PCA live in some high-dimensional feature space and need not have pre-images in input space. This work presents ideas for finding approximate pre-images, focusing on Gaussian kernels, and shows experimental results using these pre-images in data reconstruction and de-noising on toy examples as well as on real-world data.
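The abstract summarizes the approach at a high level. As a concrete illustration, below is a minimal NumPy sketch of Gaussian-kernel PCA followed by a fixed-point pre-image iteration of the kind the paper describes for Gaussian kernels. It is a sketch under stated assumptions, not the authors' implementation: feature-space centering is omitted for brevity, and the kernel width `sigma`, component count `q`, and the toy parabola data are illustrative choices rather than values from the paper.

```python
# Sketch: Gaussian-kernel PCA de-noising via an approximate pre-image.
# Assumptions: no feature-space centering; sigma, q, and the toy data are
# illustrative, not taken from the paper.
import numpy as np

def rbf_kernel(A, B, sigma):
    """Gaussian kernel matrix k(a, b) = exp(-||a - b||^2 / (2 sigma^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kpca_fit(X, q, sigma):
    """Top-q kernel PCA coefficients alpha (n x q), scaled so that the
    corresponding feature-space eigenvectors have unit length."""
    K = rbf_kernel(X, X, sigma)
    eigvals, eigvecs = np.linalg.eigh(K)          # ascending order
    idx = np.argsort(eigvals)[::-1][:q]           # pick the q largest
    lam, A = eigvals[idx], eigvecs[:, idx]
    return A / np.sqrt(lam)                       # alpha_k = a_k / sqrt(lambda_k)

def preimage(x, X, alpha, sigma, n_iter=100):
    """De-noise one point: project Phi(x) onto the top-q components, then
    iterate z <- sum_i gamma_i k(z, x_i) x_i / sum_i gamma_i k(z, x_i)."""
    beta = alpha.T @ rbf_kernel(X, x[None, :], sigma).ravel()   # component projections
    gamma = alpha @ beta                                        # expansion coefficients
    z = x.copy()                                                # start at the noisy point
    for _ in range(n_iter):
        w = gamma * rbf_kernel(X, z[None, :], sigma).ravel()
        denom = w.sum()
        if abs(denom) < 1e-12:                                  # guard against collapse
            break
        z = (w[:, None] * X).sum(0) / denom
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = rng.uniform(-1, 1, size=(200, 1))
    X = np.hstack([t, t ** 2]) + 0.05 * rng.normal(size=(200, 2))  # noisy parabola
    sigma, q = 0.5, 4
    alpha = kpca_fit(X, q, sigma)
    x_noisy = np.array([0.3, 0.3 ** 2 + 0.2])
    print("de-noised:", preimage(x_noisy, X, alpha, sigma))
```

The fixed-point update keeps the pre-image inside the convex hull of the training data, which is why it suits Gaussian kernels; for other kernels or for a pre-image method based on learned regression, a library such as scikit-learn's `KernelPCA` with `fit_inverse_transform=True` offers a different approximation.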

Cite

Text

Mika et al. "Kernel PCA and De-Noising in Feature Spaces." Neural Information Processing Systems, 1998.

Markdown

[Mika et al. "Kernel PCA and De-Noising in Feature Spaces." Neural Information Processing Systems, 1998.](https://mlanthology.org/neurips/1998/mika1998neurips-kernel/)

BibTeX

@inproceedings{mika1998neurips-kernel,
  title     = {{Kernel PCA and De-Noising in Feature Spaces}},
  author    = {Mika, Sebastian and Schölkopf, Bernhard and Smola, Alex J. and Müller, Klaus-Robert and Scholz, Matthias and Rätsch, Gunnar},
  booktitle = {Neural Information Processing Systems},
  year      = {1998},
  pages     = {536--542},
  url       = {https://mlanthology.org/neurips/1998/mika1998neurips-kernel/}
}