Barely Biased Learning for Gaussian Process Regression
Abstract
Recent work in scalable approximate Gaussian process regression has discussed a bias-variance-computation trade-off when estimating the log marginal likelihood. We suggest a method that adaptively selects the amount of computation to use when estimating the log marginal likelihood so that the bias of the objective function is guaranteed to be small. While in principle a simple modification of existing approximations, our current implementation of the method is not computationally competitive with these existing approximations, limiting its applicability.
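The abstract's central idea, spending just enough computation that the bias of the estimated log marginal likelihood is provably small, can be illustrated on the quadratic term of that objective. Below is a minimal NumPy sketch of this general idea, not the authors' implementation: it runs conjugate gradients on (K + σ²I)v = y and keeps iterating until a residual-based bound on the bias falls below a user-chosen tolerance, relying on the fact that the smallest eigenvalue of K + σ²I is at least the noise variance σ². The function names, tolerance, and synthetic data are hypothetical, and the log-determinant term of the objective is not handled here.

```python
import numpy as np


def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix for inputs X of shape (n, d)."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)


def adaptive_quadratic_term(K, y, noise_var, tol=1e-3, max_iter=1000):
    """Estimate -0.5 * y^T (K + noise_var*I)^{-1} y with bias at most `tol`.

    Runs conjugate gradients on (K + noise_var*I) v = y, adding iterations
    until 0.5 * ||r||^2 / noise_var < tol, where r = y - (K + noise_var*I) v.
    Since the smallest eigenvalue of K + noise_var*I is at least noise_var,
    this quantity upper-bounds the gap between the returned estimate and the
    exact quadratic term.
    """
    A = K + noise_var * np.eye(len(y))
    v = np.zeros_like(y)
    r = y - A @ v                       # residual of the linear system
    p = r.copy()                        # search direction
    rs = r @ r
    for _ in range(max_iter):
        if 0.5 * rs / noise_var < tol:  # bias already provably below tol
            break
        Ap = A @ p
        alpha = rs / (p @ Ap)
        v += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    # For any v, 2 v^T y - v^T A v is a lower bound on y^T A^{-1} y, so the
    # estimate below exceeds the true quadratic term by at most the bound.
    quad_lower_bound = 2.0 * (v @ y) - v @ (A @ v)
    return -0.5 * quad_lower_bound, 0.5 * rs / noise_var


# Hypothetical usage on small synthetic data, checked against the exact value.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.normal(size=200)
K = rbf_kernel(X)
noise_var = 0.01

estimate, bias_bound = adaptive_quadratic_term(K, y, noise_var, tol=1e-3)
exact = -0.5 * y @ np.linalg.solve(K + noise_var * np.eye(200), y)
print(f"estimate={estimate:.4f}  exact={exact:.4f}  bias bound={bias_bound:.2e}")
```

The guarantee rests on the identity y^T A^{-1} y - (2 v^T y - v^T A v) = r^T A^{-1} r with r = y - Av, which is at most ||r||²/σ², so the returned estimate overshoots the true quadratic term by no more than the reported bound.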
Cite
Text
Burt et al. "Barely Biased Learning for Gaussian Process Regression." NeurIPS 2021 Workshops: ICBINB, 2021.
Markdown
[Burt et al. "Barely Biased Learning for Gaussian Process Regression." NeurIPS 2021 Workshops: ICBINB, 2021.](https://mlanthology.org/neuripsw/2021/burt2021neuripsw-barely/)
BibTeX
@inproceedings{burt2021neuripsw-barely,
title = {{Barely Biased Learning for Gaussian Process Regression}},
author = {Burt, David R. and Artemev, Artem and van der Wilk, Mark},
booktitle = {NeurIPS 2021 Workshops: ICBINB},
year = {2021},
url = {https://mlanthology.org/neuripsw/2021/burt2021neuripsw-barely/}
}