Deep Learning with Kernels Through RKHM and the Perron-Frobenius Operator

Abstract

A reproducing kernel Hilbert $C^*$-module (RKHM) is a generalization of a reproducing kernel Hilbert space (RKHS) by means of $C^*$-algebra, and the Perron-Frobenius operator is a linear operator related to the composition of functions. Combining these two concepts, we present deep RKHM, a deep learning framework for kernel methods. We derive a new Rademacher generalization bound in this setting and provide a theoretical interpretation of benign overfitting by means of Perron-Frobenius operators. By virtue of $C^*$-algebra, the dependency of the bound on the output dimension is milder than that of existing bounds. We show that $C^*$-algebra is a suitable tool for deep learning with kernels, enabling us to take advantage of the product structure of operators and to provide a clear connection with convolutional neural networks. Our theoretical analysis provides a new lens through which one can design and analyze deep kernel methods.
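To make the opening idea concrete: an RKHM replaces the scalar-valued kernel of an RKHS with a kernel taking values in a $C^*$-algebra, such as the algebra of $d \times d$ matrices. The following is a minimal numerical sketch of that structural point only, using a toy matrix-valued kernel of my own choosing (a Gaussian scalar kernel times a fixed positive semidefinite matrix); it is not the kernel or the deep construction from the paper.

```python
import numpy as np

def matrix_kernel(x, y):
    """Toy C*-algebra-valued kernel: a scalar Gaussian kernel scaled by
    a fixed positive semidefinite 2 x 2 matrix. This specific form is an
    assumption for illustration, not the paper's construction."""
    A = np.array([[2.0, 1.0], [1.0, 2.0]])   # fixed PSD matrix in the algebra
    s = np.exp(-np.sum((x - y) ** 2))        # scalar Gaussian kernel
    return s * A

# Sample points and the block Gram matrix G, whose (i, j) block is k(x_i, x_j).
X = [np.array([0.0]), np.array([0.5]), np.array([1.0])]
n, d = len(X), 2
G = np.block([[matrix_kernel(X[i], X[j]) for j in range(n)] for i in range(n)])

# The defining property carried over from the scalar RKHS setting:
# the block Gram matrix must be positive semidefinite as an operator.
eigvals = np.linalg.eigvalsh(G)
print(G.shape)                 # (6, 6)
print(eigvals.min() >= -1e-10) # True: G is PSD
```

Here the Gram matrix factors as a Kronecker product of a scalar PSD Gram matrix with the PSD matrix `A`, so positive semidefiniteness is immediate; general $C^*$-algebra-valued kernels require it as a definition rather than obtaining it for free.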

Cite

Text

Hashimoto et al. "Deep Learning with Kernels Through RKHM and the Perron-Frobenius Operator." Neural Information Processing Systems, 2023.

Markdown

[Hashimoto et al. "Deep Learning with Kernels Through RKHM and the Perron-Frobenius Operator." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/hashimoto2023neurips-deep/)

BibTeX

@inproceedings{hashimoto2023neurips-deep,
  title     = {{Deep Learning with Kernels Through RKHM and the Perron-Frobenius Operator}},
  author    = {Hashimoto, Yuka and Ikeda, Masahiro and Kadri, Hachem},
  booktitle = {Neural Information Processing Systems},
  year      = {2023},
  url       = {https://mlanthology.org/neurips/2023/hashimoto2023neurips-deep/}
}