A Unifying Framework of Anytime Sparse Gaussian Process Regression Models with Stochastic Variational Inference for Big Data
Abstract
This paper presents a novel unifying framework of anytime sparse Gaussian process regression (SGPR) models that can produce good predictive performance quickly and improve it over time. Our proposed unifying framework reverses the variational inference procedure to theoretically construct a non-trivial, concave functional that is maximized at the predictive distribution of any SGPR model of our choice. From this, a stochastic natural gradient ascent method can be derived that iteratively follows the stochastic natural gradient of the functional to improve its estimate of the predictive distribution of the chosen SGPR model, and that is guaranteed to converge asymptotically to it. Interestingly, we show that if the predictive distribution of the chosen SGPR model satisfies certain decomposability conditions, then the stochastic natural gradient is an unbiased estimator of the exact natural gradient and can be computed in constant time (i.e., independent of data size) at each iteration. We empirically evaluate the trade-off between the predictive performance and time efficiency of the anytime SGPR models on two real-world million-sized datasets.
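The constant-time update described above follows the general pattern of stochastic variational inference: at each iteration, the natural parameters of the variational distribution are moved toward an unbiased minibatch estimate of their optimum, at a cost independent of the dataset size. The sketch below is not the paper's SGPR algorithm; it illustrates the classic natural-gradient SVI update on a deliberately simple conjugate model (Gaussian observations with a Gaussian prior on the mean), where the update has a closed form and the iterate provably converges to the exact posterior. All variable names are my own.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate model: y_i ~ N(theta, sigma2), prior theta ~ N(0, 1).
# The exact posterior over theta is Gaussian, so the natural-gradient
# SVI update has the closed form
#     eta <- (1 - rho_t) * eta + rho_t * eta_hat(minibatch),
# where eta are the natural parameters of q(theta) and eta_hat is an
# unbiased minibatch estimate of the optimal natural parameters.

N, sigma2 = 1_000_000, 4.0
y = rng.normal(1.5, np.sqrt(sigma2), size=N)

# Natural parameters of a Gaussian: eta1 = m / s2, eta2 = -1 / (2 * s2).
eta1, eta2 = 0.0, -0.5            # initialize q(theta) at the prior N(0, 1)

B = 64                            # minibatch size: each step costs O(B), not O(N)
for t in range(1, 2001):
    batch = y[rng.integers(0, N, size=B)]
    # Minibatch estimate of the optimal natural parameters
    # (prior contribution + rescaled likelihood contribution).
    eta1_hat = 0.0 + (N / B) * batch.sum() / sigma2
    eta2_hat = -0.5 - N / (2.0 * sigma2)
    rho = (t + 10.0) ** -0.7      # Robbins-Monro step-size schedule
    eta1 = (1 - rho) * eta1 + rho * eta1_hat
    eta2 = (1 - rho) * eta2 + rho * eta2_hat

# Recover the mean/variance of q(theta) from its natural parameters.
s2 = -1.0 / (2.0 * eta2)
m = eta1 * s2

# Exact posterior, for comparison: the anytime iterate approaches it.
s2_star = 1.0 / (1.0 + N / sigma2)
m_star = s2_star * y.sum() / sigma2
print(m, m_star)
```

Because `eta1_hat` is unbiased and the step sizes satisfy the Robbins-Monro conditions, the estimate `m` drifts toward the exact posterior mean `m_star` as iterations accumulate, which mirrors the "good fast, better over time" anytime behavior the abstract describes.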
Cite
Text
Hoang et al. "A Unifying Framework of Anytime Sparse Gaussian Process Regression Models with Stochastic Variational Inference for Big Data." International Conference on Machine Learning, 2015.
Markdown
[Hoang et al. "A Unifying Framework of Anytime Sparse Gaussian Process Regression Models with Stochastic Variational Inference for Big Data." International Conference on Machine Learning, 2015.](https://mlanthology.org/icml/2015/hoang2015icml-unifying/)
BibTeX
@inproceedings{hoang2015icml-unifying,
title = {{A Unifying Framework of Anytime Sparse Gaussian Process Regression Models with Stochastic Variational Inference for Big Data}},
author = {Hoang, Trong Nghia and Hoang, Quang Minh and Low, Bryan Kian Hsiang},
booktitle = {International Conference on Machine Learning},
year = {2015},
pages = {569--578},
volume = {37},
url = {https://mlanthology.org/icml/2015/hoang2015icml-unifying/}
}