Generalization Bounds for Kernel Canonical Correlation Analysis
Abstract
We study the problem of multiview representation learning using kernel canonical correlation analysis (KCCA) and establish non-asymptotic bounds on generalization error for regularized empirical risk minimization. In particular, we give fine-grained high-probability bounds on generalization error ranging from $O(n^{-1/6})$ to $O(n^{-1/5})$ depending on underlying distributional properties, where $n$ is the number of data samples. For the special case of finite-dimensional Hilbert spaces (such as linear CCA), our rates improve, ranging from $O(n^{-1/2})$ to $O(n^{-1})$. Finally, our results generalize to the problem of functional canonical correlation analysis over abstract Hilbert spaces.
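To make the object of study concrete, below is a minimal sketch of one standard regularized empirical KCCA formulation, in which the empirical canonical correlations are the singular values of $(K_x + \kappa I)^{-1} K_x K_y (K_y + \kappa I)^{-1}$ for centered Gram matrices $K_x, K_y$. The RBF kernel, the specific regularization form, and all names (`rbf_gram`, `kcca_top_correlation`, `kappa`) are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np
from scipy.linalg import solve, svdvals

def rbf_gram(Z, gamma=1.0):
    # Gaussian (RBF) Gram matrix; the kernel choice is an illustrative assumption.
    sq = np.sum(Z ** 2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T))

def center_gram(K):
    # Double-center the Gram matrix (i.e., center the features in the RKHS).
    n = K.shape[0]
    H = np.eye(n) - np.full((n, n), 1.0 / n)
    return H @ K @ H

def kcca_top_correlation(Kx, Ky, kappa=1e-1):
    # Regularized empirical KCCA: under the constraints
    #   alpha' (Kx + kappa I)^2 alpha = beta' (Ky + kappa I)^2 beta = 1,
    # the canonical correlations are the singular values of
    #   M = (Kx + kappa I)^{-1} Kx Ky (Ky + kappa I)^{-1}.
    n = Kx.shape[0]
    Kx, Ky = center_gram(Kx), center_gram(Ky)
    Wx = solve(Kx + kappa * np.eye(n), Kx)  # (Kx + kappa I)^{-1} Kx
    Wy = solve(Ky + kappa * np.eye(n), Ky)  # (Ky + kappa I)^{-1} Ky
    M = Wx @ Wy.T                           # Wy.T = Ky (Ky + kappa I)^{-1} by symmetry
    return svdvals(M)[0]                    # top empirical canonical correlation

# Usage on synthetic two-view data with a shared latent signal.
rng = np.random.default_rng(0)
n = 200
shared = rng.normal(size=(n, 2))             # latent signal common to both views
X = shared + 0.3 * rng.normal(size=(n, 2))   # view one
Y = shared + 0.3 * rng.normal(size=(n, 2))   # view two
rho = kcca_top_correlation(rbf_gram(X), rbf_gram(Y))
print(f"top empirical canonical correlation: {rho:.3f}")
```

In this formulation the regularizer $\kappa$ controls the bias-variance trade-off that generalization bounds of the kind studied here quantify.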
Cite

Text:
Ullah and Arora. "Generalization Bounds for Kernel Canonical Correlation Analysis." Transactions on Machine Learning Research, 2023.

Markdown:
[Ullah and Arora. "Generalization Bounds for Kernel Canonical Correlation Analysis." Transactions on Machine Learning Research, 2023.](https://mlanthology.org/tmlr/2023/ullah2023tmlr-generalization/)

BibTeX:
@article{ullah2023tmlr-generalization,
  title   = {{Generalization Bounds for Kernel Canonical Correlation Analysis}},
  author  = {Ullah, Enayat and Arora, Raman},
  journal = {Transactions on Machine Learning Research},
  year    = {2023},
  url     = {https://mlanthology.org/tmlr/2023/ullah2023tmlr-generalization/}
}