Marginalised Gaussian Processes with Nested Sampling
Abstract
Gaussian Process models are a rich distribution over functions with inductive biases controlled by a kernel function. Learning occurs through optimisation of the kernel hyperparameters using the marginal likelihood as the objective. This work proposes nested sampling as a means of marginalising the kernel hyperparameters, since it is a technique well suited to exploring complex, multi-modal distributions. We benchmark against Hamiltonian Monte Carlo on time-series and two-dimensional regression tasks, finding that a principled approach to quantifying hyperparameter uncertainty substantially improves the quality of prediction intervals.
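To make the idea concrete, here is a minimal sketch (not the authors' code) of marginalising GP hyperparameters with nested sampling: a squared-exponential kernel's lengthscale, signal variance and noise variance are sampled from the marginal-likelihood posterior using the third-party `dynesty` package. The toy data set, the log-uniform prior ranges and all variable names are illustrative assumptions.

```python
# Sketch: GP hyperparameter marginalisation via nested sampling (dynesty).
# Assumptions: RBF kernel, toy 1-D data, log-uniform priors on hyperparameters.
import numpy as np
from scipy.linalg import cho_factor, cho_solve
import dynesty

rng = np.random.default_rng(0)

# Toy time-series-style data (assumed; any (X, y) regression set would do).
X = np.linspace(0.0, 10.0, 40)[:, None]
y = np.sin(X[:, 0]) + 0.2 * rng.standard_normal(40)
n = y.size

def rbf_kernel(X1, X2, lengthscale, signal_var):
    """Squared-exponential (RBF) kernel matrix."""
    d2 = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return signal_var * np.exp(-0.5 * d2 / lengthscale ** 2)

def log_marginal_likelihood(theta):
    """GP log marginal likelihood log p(y | X, theta).

    theta = (log lengthscale, log signal variance, log noise variance).
    """
    lengthscale, signal_var, noise_var = np.exp(theta)
    K = rbf_kernel(X, X, lengthscale, signal_var) + noise_var * np.eye(n)
    try:
        L, lower = cho_factor(K, lower=True)
    except np.linalg.LinAlgError:
        return -1e25  # kernel matrix numerically not positive definite
    alpha = cho_solve((L, lower), y)
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * n * np.log(2.0 * np.pi))

def prior_transform(u):
    """Map the unit cube to log-uniform priors (assumed ranges)."""
    return np.array([-2.0 + 4.0 * u[0],   # log lengthscale in [-2, 2]
                     -4.0 + 6.0 * u[1],   # log signal variance in [-4, 2]
                     -4.0 + 6.0 * u[2]])  # log noise variance in [-4, 2]

sampler = dynesty.NestedSampler(log_marginal_likelihood, prior_transform, ndim=3)
sampler.run_nested(dlogz=0.1)
res = sampler.results

# Reweight the nested-sampling run into equally weighted posterior draws over
# the hyperparameters; predictions can then be averaged over these draws.
weights = np.exp(res.logwt - res.logz[-1])
posterior_thetas = dynesty.utils.resample_equal(res.samples, weights)
print("log-evidence:", res.logz[-1], "posterior draws:", posterior_thetas.shape)
```

As a by-product, nested sampling also returns the model evidence (`res.logz`), which is what distinguishes it from purely posterior-sampling baselines such as Hamiltonian Monte Carlo.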
Cite
Text
Simpson et al. "Marginalised Gaussian Processes with Nested Sampling." Neural Information Processing Systems, 2021.
Markdown
[Simpson et al. "Marginalised Gaussian Processes with Nested Sampling." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/simpson2021neurips-marginalised/)
BibTeX
@inproceedings{simpson2021neurips-marginalised,
  title     = {{Marginalised Gaussian Processes with Nested Sampling}},
  author    = {Simpson, Fergus and Lalchand, Vidhi and Rasmussen, Carl Edward},
  booktitle = {Neural Information Processing Systems},
  year      = {2021},
  url       = {https://mlanthology.org/neurips/2021/simpson2021neurips-marginalised/}
}