On the Convergence of Eigenspaces in Kernel Principal Component Analysis
Abstract
This paper presents a non-asymptotic statistical analysis of Kernel-PCA with a focus different from that of previous work on this topic. Here, instead of considering the reconstruction error of KPCA, we are interested in approximation error bounds for the eigenspaces themselves. We prove an upper bound depending on the spacing between eigenvalues but not on the dimensionality of the eigenspace. As a consequence, this allows us to infer stability results for these estimated spaces.
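The empirical objects the abstract refers to can be made concrete: kernel PCA extracts the leading eigenvectors of the centered kernel matrix, and the "spacing between eigenvalues" is the eigengap between the last retained eigenvalue and the first discarded one. Below is a minimal illustrative sketch (not the paper's method; the function names, kernel choice, and data are ours) that computes the orthogonal projector onto the span of the top-d empirical KPCA eigenvectors and the associated eigengap.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def kpca_projector(X, d, gamma=1.0):
    """Empirical KPCA: eigenvalues of the centered kernel matrix (scaled
    by 1/n) and the orthogonal projector onto the top-d eigenvectors,
    expressed in sample coordinates."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    Kc = H @ rbf_kernel(X, gamma) @ H     # centered kernel matrix
    vals, vecs = np.linalg.eigh(Kc)       # ascending order
    vals, vecs = vals[::-1], vecs[:, ::-1]  # sort descending
    U = vecs[:, :d]                       # top-d eigenvectors (orthonormal)
    return vals / n, U @ U.T              # eigenvalues, projector P = U U^T

# Anisotropic toy data (ours) so the leading eigenvalues are well separated.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3)) * np.array([3.0, 1.0, 0.3])
eigs, P = kpca_projector(X, d=2)
gap = eigs[1] - eigs[2]  # eigengap between the 2nd and 3rd eigenvalues
```

The bound in the paper controls how far such an empirical eigenspace can be from its population counterpart in terms of this eigengap, independently of the dimension d of the retained subspace.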
Citation

Zwald and Blanchard. "On the Convergence of Eigenspaces in Kernel Principal Component Analysis." Neural Information Processing Systems, 2005.

BibTeX
@inproceedings{zwald2005neurips-convergence,
title = {{On the Convergence of Eigenspaces in Kernel Principal Component Analysis}},
author = {Zwald, Laurent and Blanchard, Gilles},
booktitle = {Neural Information Processing Systems},
year = {2005},
pages = {1649--1656},
url = {https://mlanthology.org/neurips/2005/zwald2005neurips-convergence/}
}