On Average-Case Error Bounds for Kernel-Based Bayesian Quadrature
Abstract
In this paper, we study error bounds for Bayesian quadrature (BQ), with an emphasis on noisy settings, randomized algorithms, and average-case performance measures. We seek to approximate the integral of functions in a Reproducing Kernel Hilbert Space (RKHS), focusing in particular on the Matérn-$\nu$ and squared exponential (SE) kernels, with samples from the function potentially corrupted by Gaussian noise. We provide a two-step meta-algorithm that serves as a general tool for relating the average-case quadrature error to the $L^2$-function approximation error. When specialized to the Matérn kernel, it recovers an existing near-optimal error rate while avoiding the existing approach of repeatedly sampling points. When specialized to other settings, it yields new average-case results, including for the SE kernel with noise and the Matérn kernel with misspecification. Finally, we present algorithm-independent lower bounds that are more general than, and/or proved differently from, existing ones.
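To make the setting concrete, below is a minimal sketch (not the paper's two-step meta-algorithm) of standard kernel-based Bayesian quadrature with an SE kernel and Gaussian observation noise, integrating against the uniform measure on $[0, 1]$. The lengthscale, noise level, sample size, and test function are illustrative assumptions, not values from the paper.

```python
# Minimal BQ sketch: posterior-mean estimate of \int_0^1 f(t) dt from noisy samples.
# All hyperparameters below are illustrative assumptions.
import numpy as np
from scipy.special import erf

def se_kernel(x1, x2, lengthscale=0.2):
    """Squared exponential kernel k(x, x') = exp(-(x - x')^2 / (2 l^2))."""
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def se_kernel_mean(x, lengthscale=0.2):
    """Closed-form kernel mean z_i = \int_0^1 k(t, x_i) dt for the SE kernel."""
    a = (1.0 - x) / (np.sqrt(2.0) * lengthscale)
    b = (0.0 - x) / (np.sqrt(2.0) * lengthscale)
    return lengthscale * np.sqrt(np.pi / 2.0) * (erf(a) - erf(b))

def bq_estimate(x, y, noise_var=1e-2, lengthscale=0.2):
    """BQ posterior mean: z^T (K + sigma^2 I)^{-1} y with noisy observations y."""
    K = se_kernel(x, x, lengthscale) + noise_var * np.eye(len(x))
    z = se_kernel_mean(x, lengthscale)
    weights = np.linalg.solve(K, z)   # quadrature weights
    return weights @ y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    f = lambda t: np.sin(2 * np.pi * t) + t       # true integral over [0, 1] is 0.5
    x = rng.uniform(0.0, 1.0, size=50)            # sample locations
    y = f(x) + 0.1 * rng.standard_normal(50)      # Gaussian-noise-corrupted evaluations
    print("BQ estimate:", bq_estimate(x, y), "| true value:", 0.5)
```

The paper's analysis concerns the average-case error of estimates of this general form, i.e., how fast the gap between the quadrature estimate and the true integral shrinks with the number of (possibly noisy) samples for functions in the kernel's RKHS.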
Cite
Text
Cai et al. "On Average-Case Error Bounds for Kernel-Based Bayesian Quadrature." Transactions on Machine Learning Research, 2023.
Markdown
[Cai et al. "On Average-Case Error Bounds for Kernel-Based Bayesian Quadrature." Transactions on Machine Learning Research, 2023.](https://mlanthology.org/tmlr/2023/cai2023tmlr-averagecase/)
BibTeX
@article{cai2023tmlr-averagecase,
title = {{On Average-Case Error Bounds for Kernel-Based Bayesian Quadrature}},
author = {Cai, Xu and Lam, Thanh and Scarlett, Jonathan},
journal = {Transactions on Machine Learning Research},
year = {2023},
url = {https://mlanthology.org/tmlr/2023/cai2023tmlr-averagecase/}
}