Most Likely Heteroscedastic Gaussian Process Regression
Abstract
This paper presents a novel Gaussian process (GP) approach to regression with input-dependent noise rates. We follow Goldberg et al.’s approach and model the noise variance using a second GP in addition to the GP governing the noise-free output value. In contrast to Goldberg et al., however, we do not use a Markov chain Monte Carlo method to approximate the posterior noise variance but a most likely noise approach. The resulting model is easy to implement and can be used directly in combination with various existing extensions of standard GPs, such as sparse approximations. Extensive experiments on both synthetic and real-world data, including a challenging perception problem in robotics, show the effectiveness of most likely heteroscedastic GP regression.
Cite
Text
Kersting et al. "Most Likely Heteroscedastic Gaussian Process Regression." International Conference on Machine Learning, 2007. doi:10.1145/1273496.1273546
Markdown
[Kersting et al. "Most Likely Heteroscedastic Gaussian Process Regression." International Conference on Machine Learning, 2007.](https://mlanthology.org/icml/2007/kersting2007icml-most/) doi:10.1145/1273496.1273546
BibTeX
@inproceedings{kersting2007icml-most,
title = {{Most Likely Heteroscedastic Gaussian Process Regression}},
author = {Kersting, Kristian and Plagemann, Christian and Pfaff, Patrick and Burgard, Wolfram},
booktitle = {International Conference on Machine Learning},
year = {2007},
pages = {393--400},
doi = {10.1145/1273496.1273546},
url = {https://mlanthology.org/icml/2007/kersting2007icml-most/}
}