Contraction Rates for Sparse Variational Approximations in Gaussian Process Regression

Abstract

We study the theoretical properties of a variational Bayes (VB) method in the Gaussian process regression model. We consider the inducing variables method and derive sufficient conditions for obtaining contraction rates for the corresponding VB posterior. As examples, we show that for three particular covariance kernels (Matérn, squared exponential, random series prior) the VB approach can achieve optimal, minimax contraction rates for a sufficiently large number of appropriately chosen inducing variables. The theoretical findings are demonstrated by numerical experiments.
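The inducing variables method studied in the paper approximates the full GP posterior through a small set of m inducing inputs, reducing the cost from cubic in n to roughly O(nm²). As a rough illustration only (not the authors' code), the following numpy sketch computes the variational predictive mean of Titsias-style sparse GP regression with a squared exponential kernel; the length-scale, noise variance, and inducing-input grid are illustrative choices, not values from the paper.

```python
import numpy as np

def se_kernel(a, b, length=0.2):
    """Squared exponential (RBF) kernel matrix for 1-d inputs."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-d2 / (2.0 * length**2))

def svgp_predict_mean(x, y, z, x_test, sigma2=0.01, length=0.2):
    """Variational predictive mean with inducing inputs z.

    Implements the optimal variational posterior of Titsias (2009):
    mean at x* is (1/sigma2) * K_{*m} (K_mm + K_mn K_nm / sigma2)^{-1} K_mn y.
    """
    Kmm = se_kernel(z, z, length) + 1e-8 * np.eye(len(z))  # jitter for stability
    Kmn = se_kernel(z, x, length)
    Ksm = se_kernel(x_test, z, length)
    A = Kmm + Kmn @ Kmn.T / sigma2
    return Ksm @ np.linalg.solve(A, Kmn @ y) / sigma2

# Toy regression: recover sin(2*pi*x) from noisy samples with 15 inducing points.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 200))
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=200)
z = np.linspace(0.0, 1.0, 15)          # inducing inputs on a grid
x_test = np.linspace(0.05, 0.95, 50)
pred = svgp_predict_mean(x, y, z, x_test)
```

Only m = 15 inducing points are needed here because the squared exponential kernel's eigenvalues decay very fast, echoing the paper's finding that a modest, kernel-dependent number of well-placed inducing variables suffices for good rates.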

Cite

Text

Nieman et al. "Contraction Rates for Sparse Variational Approximations in Gaussian Process Regression." Journal of Machine Learning Research, 2022.

Markdown

[Nieman et al. "Contraction Rates for Sparse Variational Approximations in Gaussian Process Regression." Journal of Machine Learning Research, 2022.](https://mlanthology.org/jmlr/2022/nieman2022jmlr-contraction/)

BibTeX

@article{nieman2022jmlr-contraction,
  title     = {{Contraction Rates for Sparse Variational Approximations in Gaussian Process Regression}},
  author    = {Nieman, Dennis and Szabo, Botond and van Zanten, Harry},
  journal   = {Journal of Machine Learning Research},
  year      = {2022},
  pages     = {1--26},
  volume    = {23},
  url       = {https://mlanthology.org/jmlr/2022/nieman2022jmlr-contraction/}
}