Kernel Autocovariance Operators of Stationary Processes: Estimation and Convergence
Abstract
We consider autocovariance operators of a stationary stochastic process on a Polish space that is embedded into a reproducing kernel Hilbert space. We investigate how empirical estimates of these operators converge along realizations of the process under various conditions. In particular, we examine ergodic and strongly mixing processes and obtain several asymptotic results as well as finite-sample error bounds. As applications of our theory, we provide consistency results for kernel PCA with dependent data and for the conditional mean embedding of transition probabilities. Finally, we use our approach to examine the nonparametric estimation of Markov transition operators and highlight how our theory yields a consistency analysis for a large family of spectral analysis methods, including kernel-based dynamic mode decomposition.
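As a rough illustration of how empirical kernel autocovariance operators are used in practice, the sketch below estimates the Gram matrices of time-shifted samples from a single trajectory and uses them for a kernel-based, DMD-style spectral estimate. This is not the paper's code: the Gaussian kernel, the bandwidth `sigma`, the Tikhonov regularizer `eps`, and the helper names `gaussian_gram` and `kernel_dmd` are illustrative assumptions.

```python
import numpy as np

def gaussian_gram(X, Y, sigma=1.0):
    # Gram matrix K[i, j] = exp(-||x_i - y_j||^2 / (2 sigma^2)).
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-sq / (2.0 * sigma**2))

def kernel_dmd(traj, lag=1, sigma=1.0, eps=1e-6):
    """Kernel DMD-style sketch from a single trajectory of shape (n, d).

    The lag-`lag` autocovariance operator is estimated by its empirical
    counterpart; all computations reduce to the Gram matrices
    G = k(X0, X0) and A = k(X0, X1) of the time-shifted samples.
    """
    X0, X1 = traj[:-lag], traj[lag:]
    G = gaussian_gram(X0, X0, sigma)   # empirical covariance (lag 0)
    A = gaussian_gram(X0, X1, sigma)   # empirical autocovariance (lag `lag`)
    # Eigenvalues of the regularized empirical transition operator estimate.
    T = np.linalg.solve(G + eps * np.eye(len(G)), A)
    return np.linalg.eig(T)

# Example: slow relaxation of a linear autoregressive process.
rng = np.random.default_rng(0)
x = np.zeros((500, 1))
for t in range(1, len(x)):
    x[t] = 0.9 * x[t - 1] + 0.1 * rng.standard_normal()
lam, _ = kernel_dmd(x, sigma=0.5)
print(np.sort(np.abs(lam))[::-1][:5])  # leading spectral estimates
```

Here the kernel choice and regularization are made only so the sketch runs; under the mixing assumptions discussed in the paper, the empirical Gram-matrix quantities are the objects whose convergence controls that of such spectral estimates.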
Cite
Text
Mollenhauer et al. "Kernel Autocovariance Operators of Stationary Processes: Estimation and Convergence." Journal of Machine Learning Research, 2022.
Markdown
[Mollenhauer et al. "Kernel Autocovariance Operators of Stationary Processes: Estimation and Convergence." Journal of Machine Learning Research, 2022.](https://mlanthology.org/jmlr/2022/mollenhauer2022jmlr-kernel/)
BibTeX
@article{mollenhauer2022jmlr-kernel,
title = {{Kernel Autocovariance Operators of Stationary Processes: Estimation and Convergence}},
author = {Mollenhauer, Mattes and Klus, Stefan and Schütte, Christof and Koltai, Péter},
journal = {Journal of Machine Learning Research},
year = {2022},
pages = {1-34},
volume = {23},
url = {https://mlanthology.org/jmlr/2022/mollenhauer2022jmlr-kernel/}
}