Understanding Probabilistic Sparse Gaussian Process Approximations

Abstract

Good sparse approximations are essential for practical inference in Gaussian Processes as the computational cost of exact methods is prohibitive for large datasets. The Fully Independent Training Conditional (FITC) and the Variational Free Energy (VFE) approximations are two recent popular methods. Despite superficial similarities, these approximations have surprisingly different theoretical properties and behave differently in practice. We thoroughly investigate the two methods for regression both analytically and through illustrative examples, and draw conclusions to guide practical application.
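As context for the comparison drawn in the abstract, both FITC and VFE optimise an objective of the same general form, differing only in a diagonal correction and a trace term. The sketch below uses standard sparse-GP notation (inducing inputs $u$, kernel blocks $K_{ff}, K_{uf}, K_{uu}$, noise variance $\sigma^2$, $N$ data points); it follows the common presentation of these objectives and is not quoted verbatim from the paper:

```latex
% Unified form of the FITC and VFE objectives for GP regression:
\mathcal{F} = -\frac{N}{2}\log 2\pi
  \;-\; \frac{1}{2}\log\bigl|Q_{ff} + G\bigr|
  \;-\; \frac{1}{2}\,\mathbf{y}^{\top}\bigl(Q_{ff} + G\bigr)^{-1}\mathbf{y}
  \;-\; \frac{1}{2\sigma^{2}}\,\operatorname{tr}(T),
\qquad Q_{ff} = K_{fu}K_{uu}^{-1}K_{uf}.
% FITC: G = \operatorname{diag}\!\bigl[K_{ff} - Q_{ff}\bigr] + \sigma^{2} I, \quad T = 0
% VFE:  G = \sigma^{2} I, \quad T = K_{ff} - Q_{ff}
```

The superficial similarity mentioned in the abstract is visible here: the two methods share every term except the choice of $G$ and $T$, yet FITC's input-dependent diagonal and VFE's trace penalty lead to the different behaviours the paper investigates.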

Cite

Text

Bauer et al. "Understanding Probabilistic Sparse Gaussian Process Approximations." Neural Information Processing Systems, 2016.

Markdown

[Bauer et al. "Understanding Probabilistic Sparse Gaussian Process Approximations." Neural Information Processing Systems, 2016.](https://mlanthology.org/neurips/2016/bauer2016neurips-understanding/)

BibTeX

@inproceedings{bauer2016neurips-understanding,
  title     = {{Understanding Probabilistic Sparse Gaussian Process Approximations}},
  author    = {Bauer, Matthias and van der Wilk, Mark and Rasmussen, Carl Edward},
  booktitle = {Neural Information Processing Systems},
  year      = {2016},
  pages     = {1533--1541},
  url       = {https://mlanthology.org/neurips/2016/bauer2016neurips-understanding/}
}