Marginalising over Stationary Kernels with Bayesian Quadrature

Abstract

Marginalising over families of Gaussian Process kernels produces flexible model classes with well-calibrated uncertainty estimates. Existing approaches require likelihood evaluations of many kernels, rendering them prohibitively expensive for larger datasets. We propose a Bayesian Quadrature scheme to make this marginalisation more efficient and thereby more practical. Through the use of maximum mean discrepancies between distributions, we define a kernel over kernels that captures invariances between Spectral Mixture (SM) kernels. Kernel samples are selected by generalising an information-theoretic acquisition function for warped Bayesian Quadrature. We show that our framework achieves more accurate predictions, with better-calibrated uncertainty, than state-of-the-art baselines, especially when given limited (wall-clock) time budgets.
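The "kernel over kernels" idea in the abstract can be illustrated with a small sketch. A stationary SM kernel is characterised by its spectral density, a Gaussian mixture, so a similarity between two SM kernels can be defined by the MMD between their spectral densities. The sketch below (an illustration, not the paper's implementation; the RBF base kernel, the closed-form Gaussian expectations, and the exponentiated-MMD form are assumptions for this example) compares two one-dimensional Gaussian mixtures in closed form and exponentiates the squared MMD:

```python
import numpy as np

def gauss_expect_rbf(m1, v1, m2, v2, ell):
    # E[k(x, y)] for x ~ N(m1, v1), y ~ N(m2, v2) and an RBF base
    # kernel k with lengthscale ell -- available in closed form.
    s = ell ** 2 + v1 + v2
    return np.sqrt(ell ** 2 / s) * np.exp(-(m1 - m2) ** 2 / (2 * s))

def mmd2_mixtures(w1, mu1, var1, w2, mu2, var2, ell=1.0):
    # Closed-form squared MMD between two 1-D Gaussian mixtures
    # (the spectral densities of two SM kernels).
    def cross(wa, ma, va, wb, mb, vb):
        return sum(
            wi * wj * gauss_expect_rbf(mi, vi, mj, vj, ell)
            for wi, mi, vi in zip(wa, ma, va)
            for wj, mj, vj in zip(wb, mb, vb)
        )
    return (cross(w1, mu1, var1, w1, mu1, var1)
            + cross(w2, mu2, var2, w2, mu2, var2)
            - 2 * cross(w1, mu1, var1, w2, mu2, var2))

def kernel_over_kernels(theta_a, theta_b, gamma=1.0):
    # Exponentiated-MMD similarity between two SM kernels, each given
    # as (weights, means, variances) of its spectral density.
    return np.exp(-mmd2_mixtures(*theta_a, *theta_b) / (2 * gamma ** 2))
```

Identical spectral densities give MMD zero and hence similarity one; kernels whose spectral mass sits far apart get a similarity near zero, which is the behaviour a Bayesian Quadrature surrogate over kernel space needs.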

Cite

Text

Hamid et al. "Marginalising over Stationary Kernels with Bayesian Quadrature." Artificial Intelligence and Statistics, 2022.

Markdown

[Hamid et al. "Marginalising over Stationary Kernels with Bayesian Quadrature." Artificial Intelligence and Statistics, 2022.](https://mlanthology.org/aistats/2022/hamid2022aistats-marginalising/)

BibTeX

@inproceedings{hamid2022aistats-marginalising,
  title     = {{Marginalising over Stationary Kernels with Bayesian Quadrature}},
  author    = {Hamid, Saad and Schulze, Sebastian and Osborne, Michael A. and Roberts, Stephen},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2022},
  pages     = {9776--9792},
  volume    = {151},
  url       = {https://mlanthology.org/aistats/2022/hamid2022aistats-marginalising/}
}