Kernel Interpolation for Scalable Online Gaussian Processes

Abstract

Gaussian processes (GPs) provide a gold standard for performance in online settings, such as sample-efficient control and black-box optimization, where the posterior distribution must be updated as data arrive sequentially. However, updating a GP posterior to accommodate even a single new observation after having observed $n$ points incurs at least $\mathcal{O}(n)$ computation in the exact setting. We show how to use structured kernel interpolation to efficiently reuse computations, yielding constant-time $\mathcal{O}(1)$ online updates with respect to the number of points $n$ while retaining exact inference. We demonstrate the promise of our approach in a range of online regression and classification settings, in Bayesian optimization, and in active sampling to reduce error in malaria incidence forecasting.
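To make the constant-time claim concrete, here is a minimal, hypothetical sketch (not the authors' implementation, and all names below are invented for illustration) of how structured kernel interpolation (SKI) enables $\mathcal{O}(1)$ online updates: the kernel matrix is approximated as $K \approx W K_{UU} W^\top$, where $K_{UU}$ is the kernel on a fixed grid of $m$ inducing points and each row of $W$ holds sparse interpolation weights, so the posterior mean depends on the data only through the constant-size caches $W^\top W$ and $W^\top y$.

```python
import numpy as np

# Illustrative sketch only (assumed names, not the paper's code): with
# K ≈ W K_UU W^T, the posterior mean depends on the data only through the
# caches W^T W (m x m) and W^T y (m,), so absorbing one new observation
# costs O(1) with respect to the number of points n seen so far.

def rbf(a, b, lengthscale=0.2):
    """Squared-exponential kernel between two 1-D location arrays."""
    d = np.asarray(a)[:, None] - np.asarray(b)[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

class OnlineSKIGP:
    def __init__(self, grid, noise=0.05):
        self.grid = np.asarray(grid)           # m sorted inducing locations
        self.m = len(self.grid)
        self.K_uu = rbf(self.grid, self.grid)  # fixed m x m grid kernel
        self.WtW = np.zeros((self.m, self.m))  # cached W^T W
        self.Wty = np.zeros(self.m)            # cached W^T y
        self.noise = noise                     # observation noise variance

    def _weights(self, x):
        """Sparse linear interpolation weights onto the two nearest grid points."""
        w = np.zeros(self.m)
        i = int(np.clip(np.searchsorted(self.grid, x) - 1, 0, self.m - 2))
        t = (x - self.grid[i]) / (self.grid[i + 1] - self.grid[i])
        w[i], w[i + 1] = 1.0 - t, t
        return w

    def update(self, x, y):
        """Absorb one observation; cost is independent of how many came before."""
        w = self._weights(x)
        self.WtW += np.outer(w, w)  # rank-one update of the m x m cache
        self.Wty += w * y

    def predict_mean(self, x_star):
        """Posterior mean, using W^T (W K_UU W^T + s2 I)^-1 = (W^T W K_UU + s2 I)^-1 W^T."""
        A = self.WtW @ self.K_uu + self.noise * np.eye(self.m)
        alpha = np.linalg.solve(A, self.Wty)
        return self._weights(x_star) @ self.K_uu @ alpha
```

Streaming observations then touch only the $m \times m$ caches per step, regardless of how many points have been seen, which is the mechanism the abstract refers to as reusing computations (the paper additionally keeps inference exact under the SKI kernel; this sketch shows only the caching structure).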

Cite

Text

Stanton et al. "Kernel Interpolation for Scalable Online Gaussian Processes." Artificial Intelligence and Statistics, 2021.

Markdown

[Stanton et al. "Kernel Interpolation for Scalable Online Gaussian Processes." Artificial Intelligence and Statistics, 2021.](https://mlanthology.org/aistats/2021/stanton2021aistats-kernel/)

BibTeX

@inproceedings{stanton2021aistats-kernel,
  title     = {{Kernel Interpolation for Scalable Online Gaussian Processes}},
  author    = {Stanton, Samuel and Maddox, Wesley and Delbridge, Ian and Gordon Wilson, Andrew},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2021},
  pages     = {3133--3141},
  volume    = {130},
  url       = {https://mlanthology.org/aistats/2021/stanton2021aistats-kernel/}
}