Dimensionality Reduction and Generalization
Abstract
In this paper we investigate the regularization property of Kernel Principal Component Analysis (KPCA) by studying its application as a preprocessing step to supervised learning problems. We show that performing KPCA and then ordinary least squares on the projected data, a procedure known as kernel principal component regression (KPCR), is equivalent to spectral cut-off regularization, the regularization parameter being exactly the number of principal components to keep. Using probabilistic estimates for integral operators, we prove error estimates for KPCR and propose a parameter choice procedure that allows us to prove consistency of the algorithm.
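To make the procedure concrete, here is a minimal NumPy sketch of KPCR, not taken from the paper: the Gaussian kernel, the centering details, and the function names (gaussian_kernel, kpcr) are illustrative assumptions. It performs KPCA on the centered training kernel matrix, keeps the k leading eigenpairs (the spectral cut-off, with k as the regularization parameter), and runs ordinary least squares on the resulting scores.

import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Gaussian (RBF) kernel matrix; the kernel choice is an assumption,
    # any Mercer kernel would do.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kpcr(X_train, y_train, X_test, k, sigma=1.0):
    """KPCR sketch: KPCA with k components, then ordinary least squares."""
    n = X_train.shape[0]
    K = gaussian_kernel(X_train, X_train, sigma)
    # Center the kernel matrix in feature space.
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    # Spectral cut-off: keep only the k leading eigenpairs of Kc.
    evals, evecs = np.linalg.eigh(Kc)            # eigh returns ascending order
    idx = np.argsort(evals)[::-1][:k]
    lam = np.maximum(evals[idx], 1e-12)          # guard against tiny/negative values
    V = evecs[:, idx]
    # KPCA scores of the training points: Kc V diag(lam)^{-1/2} = V diag(lam)^{1/2}.
    Z_train = V * np.sqrt(lam)
    # Ordinary least squares on the projected (centered) data.
    y_mean = y_train.mean()
    w, *_ = np.linalg.lstsq(Z_train, y_train - y_mean, rcond=None)
    # Project the test points, with the standard KPCA centering correction.
    m = X_test.shape[0]
    K_test = gaussian_kernel(X_test, X_train, sigma)
    one_n = np.ones((n, n)) / n
    one_m = np.ones((m, n)) / n
    K_test_c = K_test - one_m @ K - K_test @ one_n + one_m @ K @ one_n
    Z_test = K_test_c @ (V / np.sqrt(lam))
    return Z_test @ w + y_mean

# Usage: fit a noisy sine and predict on held-out points.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
y_hat = kpcr(X[:100], y[:100], X[100:], k=10)

Increasing k weakens the regularization; roughly speaking, the parameter choice procedure studied in the paper amounts to selecting k as a function of the sample size.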
Cite
Text
Mosci et al. "Dimensionality Reduction and Generalization." International Conference on Machine Learning, 2007. doi:10.1145/1273496.1273579
Markdown
[Mosci et al. "Dimensionality Reduction and Generalization." International Conference on Machine Learning, 2007.](https://mlanthology.org/icml/2007/mosci2007icml-dimensionality/) doi:10.1145/1273496.1273579
BibTeX
@inproceedings{mosci2007icml-dimensionality,
title = {{Dimensionality Reduction and Generalization}},
author = {Mosci, Sofia and Rosasco, Lorenzo and Verri, Alessandro},
booktitle = {International Conference on Machine Learning},
year = {2007},
pages = {657--664},
doi = {10.1145/1273496.1273579},
url = {https://mlanthology.org/icml/2007/mosci2007icml-dimensionality/}
}