Gaussian Process Quantile Regression Using Expectation Propagation
Abstract
Direct quantile regression involves estimating a given quantile of a response variable as a function of input variables. We present a new framework for direct quantile regression in which a Gaussian process model is learned by minimising the expected tilted loss function. The integration required in learning is not analytically tractable, so to speed up learning we employ the Expectation Propagation algorithm. We describe how this work relates to other quantile regression methods and apply the method to both synthetic and real data sets. The method is shown to be competitive with state-of-the-art methods whilst allowing the full Gaussian process probabilistic framework to be leveraged.
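As a minimal illustration of the tilted (pinball) loss that the abstract refers to, the sketch below shows how it penalises under- and over-prediction asymmetrically for a target quantile tau. This is a generic NumPy sketch of the standard loss, not code from the paper; the function name and example values are illustrative assumptions.

```python
import numpy as np

def tilted_loss(y, f, tau):
    """Tilted (pinball) loss rho_tau(y - f) for target quantile tau.

    Under-prediction (y > f) is weighted by tau, over-prediction by (1 - tau),
    so minimising the expected loss yields the tau-quantile of y given the inputs.
    """
    u = y - f
    return np.where(u >= 0, tau * u, (tau - 1.0) * u)

# Example: for tau = 0.9, under-prediction is penalised nine times more
# heavily than over-prediction (values here are purely illustrative).
y = np.array([1.0, 2.0, 3.0])
f = np.array([0.5, 2.5, 3.0])
print(tilted_loss(y, f, tau=0.9))  # [0.45 0.05 0.  ]
```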
Cite
Text
Boukouvalas et al. "Gaussian Process Quantile Regression Using Expectation Propagation." International Conference on Machine Learning, 2012.
Markdown
[Boukouvalas et al. "Gaussian Process Quantile Regression Using Expectation Propagation." International Conference on Machine Learning, 2012.](https://mlanthology.org/icml/2012/boukouvalas2012icml-gaussian/)
BibTeX
@inproceedings{boukouvalas2012icml-gaussian,
title = {{Gaussian Process Quantile Regression Using Expectation Propagation}},
author = {Boukouvalas, Alexis and Barillec, Remi Louis and Cornford, Dan},
booktitle = {International Conference on Machine Learning},
year = {2012},
url = {https://mlanthology.org/icml/2012/boukouvalas2012icml-gaussian/}
}