Transforming Gaussian Processes with Normalizing Flows

Abstract

Gaussian Processes (GPs) can be used as flexible, non-parametric function priors. Inspired by the growing body of work on Normalizing Flows, we enlarge this class of priors through a parametric invertible transformation that can be made input-dependent. Doing so also allows us to encode interpretable prior knowledge (e.g., boundedness constraints). We derive a variational approximation to the resulting Bayesian inference problem, which is as fast as stochastic variational GP regression (Hensman et al., 2013; Dezfouli and Bonilla, 2015). This makes the model a computationally efficient alternative to other hierarchical extensions of GP priors (Lázaro-Gredilla, 2012; Damianou and Lawrence, 2013). The resulting algorithm’s computational and inferential performance is excellent, and we demonstrate this on a range of data sets. For example, even with only 5 inducing points and an input-dependent flow, our method is consistently competitive with a standard sparse GP fitted using 100 inducing points.
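To make the construction concrete, below is a minimal, illustrative NumPy sketch (not the authors' implementation) of a transformed GP prior: a draw from a zero-mean GP is pushed through an invertible softplus map, yielding a prior over strictly positive functions, which is one way to encode a boundedness constraint. In the paper the transformation is a parametric flow whose parameters can additionally depend on the input; the kernel, lengthscale, and input grid below are arbitrary choices made only for this example.

import numpy as np

# Illustrative sketch of a transformed GP prior (positivity via a softplus flow).
# All modelling choices here (RBF kernel, lengthscale, grid) are assumptions
# made for the example, not the paper's settings.

def rbf_kernel(x1, x2, lengthscale=0.2, variance=1.0):
    # Squared-exponential covariance between two sets of 1-D inputs.
    sq_dist = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dist / lengthscale ** 2)

def softplus(f):
    # Invertible, monotone map from R to (0, inf): log(1 + exp(f)).
    return np.log1p(np.exp(f))

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)

# Draw f ~ GP(0, k) at the inputs x; jitter keeps the covariance positive definite.
K = rbf_kernel(x, x) + 1e-8 * np.eye(x.size)
f = rng.multivariate_normal(np.zeros(x.size), K)

# g = T(f) is a draw from the transformed (warped) prior: positive by construction.
# An input-dependent flow would instead use T(f, x), with the flow's parameters
# produced as functions of x.
g = softplus(f)
assert np.all(g > 0.0)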

Cite

Text

Maroñas et al. "Transforming Gaussian Processes with Normalizing Flows." Artificial Intelligence and Statistics, 2021.

Markdown

[Maroñas et al. "Transforming Gaussian Processes with Normalizing Flows." Artificial Intelligence and Statistics, 2021.](https://mlanthology.org/aistats/2021/maronas2021aistats-transforming/)

BibTeX

@inproceedings{maronas2021aistats-transforming,
  title     = {{Transforming Gaussian Processes with Normalizing Flows}},
  author    = {Maroñas, Juan and Hamelijnck, Oliver and Knoblauch, Jeremias and Damoulas, Theodoros},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2021},
  pages     = {1081--1089},
  volume    = {130},
  url       = {https://mlanthology.org/aistats/2021/maronas2021aistats-transforming/}
}