A Cure for Variance Inflation in High Dimensional Kernel Principal Component Analysis
Abstract
Small-sample, high-dimensional principal component analysis (PCA) suffers from variance inflation and lack of generalizability. It has earlier been pointed out that a simple leave-one-out variance renormalization scheme can cure the problem. In this paper we generalize the cure in two directions: first, we propose a computationally less intensive approximate leave-one-out estimator; second, we show that variance inflation is also present in kernel principal component analysis (kPCA) and we provide a non-parametric renormalization scheme which can quite efficiently restore generalizability in kPCA. As for PCA, our analysis also suggests a simplified approximate expression.
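To illustrate the phenomenon the abstract refers to, the sketch below measures variance inflation in ordinary PCA with a plain leave-one-out loop: in-sample component variances are compared to the variances of held-out projections, and their ratio gives a per-component renormalization factor. This is a minimal numerical sketch only; `pca_scores` and `loo_scale` are hypothetical helpers for illustration, not the authors' estimator or their computationally cheaper approximation.

```python
import numpy as np

def pca_scores(X_train, X_test, k):
    """Project X_test onto the top-k PCA components fitted on X_train."""
    mu = X_train.mean(axis=0)          # center with the *training* mean
    Xc = X_train - mu
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:k].T                       # (d, k) loading matrix
    return Xc @ W, (X_test - mu) @ W

def loo_scale(X, k):
    """Per-component inflation factor: in-sample RMS over leave-one-out RMS.

    Hypothetical helper: a naive O(n) refit loop, not the paper's scheme.
    """
    n = X.shape[0]
    loo = np.empty((n, k))
    for i in range(n):
        train = np.delete(X, i, axis=0)            # leave sample i out
        _, t = pca_scores(train, X[i:i + 1], k)    # project the held-out sample
        loo[i] = t[0]
    in_sample, _ = pca_scores(X, X, k)
    # RMS (rather than std) sidesteps sign ambiguity of SVD components
    # across the n refits; values > 1 indicate variance inflation.
    rms = lambda A: np.sqrt((A ** 2).mean(axis=0))
    return rms(in_sample) / rms(loo)
```

On small-sample, high-dimensional noise (e.g. 20 samples in 100 dimensions) the returned factors are well above 1, which is exactly the inflation the renormalization cure divides out.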
Cite
Text
Abrahamsen and Hansen. "A Cure for Variance Inflation in High Dimensional Kernel Principal Component Analysis." Journal of Machine Learning Research, 2011.

Markdown
[Abrahamsen and Hansen. "A Cure for Variance Inflation in High Dimensional Kernel Principal Component Analysis." Journal of Machine Learning Research, 2011.](https://mlanthology.org/jmlr/2011/abrahamsen2011jmlr-cure/)

BibTeX
@article{abrahamsen2011jmlr-cure,
title = {{A Cure for Variance Inflation in High Dimensional Kernel Principal Component Analysis}},
author = {Abrahamsen, Trine Julie and Hansen, Lars Kai},
journal = {Journal of Machine Learning Research},
year = {2011},
pages = {2027-2044},
volume = {12},
url = {https://mlanthology.org/jmlr/2011/abrahamsen2011jmlr-cure/}
}