Variational Heteroscedastic Gaussian Process Regression

Abstract

Standard Gaussian processes (GPs) model observation noise as constant throughout input space. This assumption is often too restrictive, but it is needed for GP inference to remain tractable. In this work we present a non-standard variational approximation that allows accurate inference in heteroscedastic GPs (i.e., under input-dependent noise conditions). Computational cost is roughly twice that of the standard GP, and likewise scales as O(n^3). Accuracy is verified by comparison with the gold standard, MCMC, and the method's effectiveness is illustrated on several synthetic and real datasets of diverse characteristics. An application to volatility forecasting is also considered.
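To make the modeling difference concrete, here is a minimal NumPy sketch of GP prediction with a per-point noise term: the constant-noise matrix sigma^2 I of a standard GP is replaced by a diagonal matrix of input-dependent noise variances. In the paper the log-noise is itself given a GP prior and inferred variationally; as a simplifying assumption, the sketch below plugs in a known noise function `r(x)` purely for illustration, and the RBF kernel hyperparameters are arbitrary.

```python
import numpy as np

def rbf(X1, X2, ell=1.0, sf2=1.0):
    """Squared-exponential kernel with lengthscale ell and signal variance sf2."""
    d = X1[:, None] - X2[None, :]
    return sf2 * np.exp(-0.5 * (d / ell) ** 2)

def hetero_gp_predict(X, y, Xs, noise_var):
    """GP posterior with per-point noise variances instead of sigma^2 * I.

    noise_var[i] is the (here assumed known) noise variance r(x_i) at
    training input X[i]; a variational treatment would infer it instead.
    """
    K = rbf(X, X) + np.diag(noise_var)   # heteroscedastic: diag(r) replaces sigma^2 I
    Ks = rbf(Xs, X)
    Kss = rbf(Xs, Xs)
    alpha = np.linalg.solve(K, y)        # O(n^3) solve, as in the standard GP
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)            # latent predictive mean and variance

# Toy data whose noise variance grows with |x| (illustrative stand-in for r(x)).
rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 40)
r = 0.01 + 0.2 * X**2
y = np.sin(X) + rng.normal(0.0, np.sqrt(r))

Xs = np.array([0.0, 2.5])
mean, var = hetero_gp_predict(X, y, Xs, r)
```

In the low-noise region near x = 0 the latent predictive variance is small, while in the high-noise region near x = 2.5 the observations are less informative and the latent variance is larger, which is exactly the behavior a constant-noise GP cannot express.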

Cite

Text

Lázaro-Gredilla and Titsias. "Variational Heteroscedastic Gaussian Process Regression." International Conference on Machine Learning, 2011.

Markdown

[Lázaro-Gredilla and Titsias. "Variational Heteroscedastic Gaussian Process Regression." International Conference on Machine Learning, 2011.](https://mlanthology.org/icml/2011/lazarogredilla2011icml-variational/)

BibTeX

@inproceedings{lazarogredilla2011icml-variational,
  title     = {{Variational Heteroscedastic Gaussian Process Regression}},
  author    = {Lázaro-Gredilla, Miguel and Titsias, Michalis K.},
  booktitle = {International Conference on Machine Learning},
  year      = {2011},
  pages     = {841--848},
  url       = {https://mlanthology.org/icml/2011/lazarogredilla2011icml-variational/}
}