Fast Iterative Kernel PCA
Abstract
We introduce two methods to improve convergence of the Kernel Hebbian Algorithm (KHA) for iterative kernel PCA. KHA has a scalar gain parameter which is either held constant or decreased as 1/t, leading to slow convergence. Our KHA/et algorithm accelerates KHA by incorporating the reciprocal of the current estimated eigenvalues as a gain vector. We then derive and apply Stochastic Meta-Descent (SMD) to KHA/et; this further speeds convergence by performing gain adaptation in RKHS. Experimental results on kernel PCA and spectral clustering of USPS digits, as well as motion-capture and image de-noising problems, confirm that our methods converge substantially faster than conventional KHA.
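To make the update the abstract describes more concrete, here is a minimal NumPy sketch of a KHA-style iteration whose per-component gains are scaled by the reciprocal of running eigenvalue estimates. The RBF kernel, the gain schedule `eta0 / (lam * t)`, and the running-average eigenvalue estimator are illustrative assumptions, not the exact KHA/et gain schedule or the SMD adaptation from the paper.

```python
import numpy as np

def rbf_kernel(X, gamma=0.1):
    # Pairwise squared distances -> RBF Gram matrix.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def kha_et_sketch(K, r=4, epochs=10, eta0=0.05, seed=0):
    """Iterative kernel PCA via a KHA-style update with per-component
    gains scaled by reciprocal eigenvalue estimates (illustrative sketch)."""
    n = K.shape[0]
    # Center the Gram matrix (standard kernel-PCA centering).
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    rng = np.random.default_rng(seed)
    A = rng.normal(scale=1e-2, size=(r, n))   # expansion coefficients: W = A Phi(X)
    lam = np.ones(r)                          # running eigenvalue estimates (assumed estimator)
    t = 1
    for _ in range(epochs):
        for i in rng.permutation(n):
            k_i = Kc[:, i]                    # kernel "view" of training pattern i
            y = A @ k_i                       # component outputs, shape (r,)
            # GHA-style Hebbian update in coefficient space:
            #   A <- A + eta * (y e_i^T - LT(y y^T) A)
            grad = -np.tril(np.outer(y, y)) @ A
            grad[:, i] += y
            # Update eigenvalue estimates; form a reciprocal-eigenvalue gain vector.
            lam = (1 - 1.0 / t) * lam + (1.0 / t) * y**2
            eta = eta0 / (lam * t + 1e-12)    # assumed gain schedule, not the paper's exact one
            A += eta[:, None] * grad
            t += 1
    return A, lam

if __name__ == "__main__":
    X = np.vstack([np.random.randn(50, 3) + c for c in (0, 3)])
    A, lam = kha_et_sketch(rbf_kernel(X))
    print("estimated eigenvalues:", np.sort(lam)[::-1])
```

Each row of `A` holds the expansion coefficients of one kernel principal component; projecting a pattern amounts to multiplying its (centered) kernel column by `A`. The point of the per-component gain vector is that components with small estimated eigenvalues receive proportionally larger steps, which is what speeds convergence relative to a single scalar gain.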
Cite
Schraudolph et al. "Fast Iterative Kernel PCA." Neural Information Processing Systems, 2006.
BibTeX
@inproceedings{schraudolph2006neurips-fast,
title = {{Fast Iterative Kernel PCA}},
author = {Schraudolph, Nicol N. and Günter, Simon and Vishwanathan, S. V. N.},
booktitle = {Neural Information Processing Systems},
year = {2006},
pages = {1225-1232},
url = {https://mlanthology.org/neurips/2006/schraudolph2006neurips-fast/}
}