Warped Gaussian Processes

Abstract

We generalise the Gaussian process (GP) framework for regression by learning a nonlinear transformation of the GP outputs. This allows for non-Gaussian processes and non-Gaussian noise. The learning algorithm chooses a nonlinear transformation such that the transformed data is well-modelled by a GP. This can be seen as including a preprocessing transformation as an integral part of the probabilistic modelling problem, rather than as an ad-hoc step. We demonstrate on several real regression problems that learning the transformation can lead to significantly better performance than using a regular GP, or a GP with a fixed transformation.
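
The following is a minimal, illustrative sketch of the idea in the abstract: warp the observed targets through a learned monotone function, fit a standard GP to the warped values, and include the warp's Jacobian in the marginal likelihood. The function names, the sum-of-tanh warp parameterisation, the RBF kernel, and the toy data are assumptions made for illustration, not the authors' reference implementation.

```python
# Sketch of a warped GP: learn a monotone warp of the targets jointly with
# GP hyperparameters by minimising the negative log marginal likelihood.
# All names and parameter choices here are illustrative assumptions.
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.optimize import minimize

def warp(y, a, b, c):
    """Monotone warp z = y + sum_i a_i * tanh(b_i * (y + c_i)), with a_i, b_i >= 0."""
    return y + np.sum(a * np.tanh(b * (y[:, None] + c)), axis=1)

def warp_grad(y, a, b, c):
    """dz/dy, needed for the Jacobian term of the warped likelihood."""
    return 1.0 + np.sum(a * b / np.cosh(b * (y[:, None] + c)) ** 2, axis=1)

def rbf_kernel(X1, X2, ell, sf2):
    """Squared-exponential covariance between two sets of inputs."""
    d2 = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=2)
    return sf2 * np.exp(-0.5 * d2 / ell ** 2)

def neg_log_marginal_likelihood(params, X, y, n_warp=3):
    # Unpack log-parameters (positivity enforced via exp), plus warp offsets c.
    ell, sf2, sn2 = np.exp(params[:3])
    a = np.exp(params[3:3 + n_warp])
    b = np.exp(params[3 + n_warp:3 + 2 * n_warp])
    c = params[3 + 2 * n_warp:3 + 3 * n_warp]

    z = warp(y, a, b, c)                                  # warped targets
    K = rbf_kernel(X, X, ell, sf2) + sn2 * np.eye(len(y))
    L, lower = cho_factor(K, lower=True)
    alpha = cho_solve((L, lower), z)

    # Standard GP negative log marginal likelihood in the warped space ...
    nll = 0.5 * z @ alpha + np.sum(np.log(np.diag(L))) + 0.5 * len(y) * np.log(2 * np.pi)
    # ... minus the log Jacobian of the warp, which maps it back to the data space.
    nll -= np.sum(np.log(warp_grad(y, a, b, c)))
    return nll

# Toy usage: targets whose noise is non-Gaussian in the original space.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.exp(np.sin(X[:, 0]) + 0.2 * rng.standard_normal(40))  # skewed targets

x0 = np.zeros(3 + 3 * 3)   # log ell, log sf2, log sn2, and warp parameters
res = minimize(neg_log_marginal_likelihood, x0, args=(X, y), method="L-BFGS-B")
print("optimised negative log marginal likelihood:", res.fun)
```

In this sketch the warp parameters are optimised jointly with the kernel hyperparameters, so the "preprocessing" transformation is learned as part of the model rather than chosen beforehand, which is the point the abstract makes.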

Cite

Text

Snelson et al. "Warped Gaussian Processes." Neural Information Processing Systems, 2003.

Markdown

[Snelson et al. "Warped Gaussian Processes." Neural Information Processing Systems, 2003.](https://mlanthology.org/neurips/2003/snelson2003neurips-warped/)

BibTeX

@inproceedings{snelson2003neurips-warped,
  title     = {{Warped Gaussian Processes}},
  author    = {Snelson, Edward and Ghahramani, Zoubin and Rasmussen, Carl E.},
  booktitle = {Neural Information Processing Systems},
  year      = {2003},
  pages     = {337--344},
  url       = {https://mlanthology.org/neurips/2003/snelson2003neurips-warped/}
}