Asynchronous Distributed Variational Gaussian Process for Regression
Abstract
Gaussian processes (GPs) are powerful non-parametric function estimators. However, their applications are largely limited by the expensive computational cost of the inference procedures. Existing stochastic or distributed synchronous variational inferences, although they have alleviated this issue by scaling up GPs to millions of samples, are still far from satisfactory for real-world large-scale applications, where data sizes are often orders of magnitude larger, say, billions. To solve this problem, we propose ADVGP, the first Asynchronous Distributed Variational Gaussian Process inference for regression, built on the recent large-scale machine learning platform PARAMETER SERVER. ADVGP uses a novel, flexible variational framework based on a weight-space augmentation and implements a highly efficient, asynchronous proximal gradient optimization. While maintaining comparable or better predictive performance, ADVGP greatly improves upon the efficiency of the existing variational methods. With ADVGP, we effortlessly scale up GP regression to a real-world application with billions of samples and demonstrate superior prediction accuracy over popular linear models.
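For readers who want a concrete picture of the weight-space view mentioned in the abstract, the sketch below is a hypothetical, single-machine simplification, not the authors' implementation: GP regression is approximated by Bayesian linear regression on basis functions phi_m(x) = k(x, z_m) built from a small set of inducing inputs, a mean-field Gaussian variational posterior over the weights is assumed, and the negative evidence lower bound is minimized with plain Adagrad-style stochastic gradient steps. ADVGP itself uses a different, more flexible variational bound and asynchronous proximal gradient updates on the PARAMETER SERVER platform; every function name, size, and hyperparameter below is an illustrative assumption.

import numpy as np

rng = np.random.default_rng(0)

def rbf_features(X, Z, lengthscale=1.0):
    # phi(x)_m = k(x, z_m) with an RBF kernel: one possible weight-space feature map.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def neg_elbo_and_grads(mu, log_var, Phi, y, noise_var=0.1, lik_scale=1.0):
    # Negative ELBO of Bayesian linear regression with prior w ~ N(0, I) and
    # q(w) = N(mu, diag(exp(log_var))), with analytic gradients.
    # lik_scale rescales a mini-batch likelihood term to the full data set size.
    var = np.exp(log_var)
    resid = y - Phi @ mu
    enll = 0.5 / noise_var * (resid @ resid + ((Phi ** 2) @ var).sum())  # E_q[-log p(y | w)] + const
    kl = 0.5 * np.sum(var + mu ** 2 - 1.0 - log_var)                     # KL(q(w) || N(0, I))
    g_mu = lik_scale * (-(Phi.T @ resid) / noise_var) + mu
    g_log_var = lik_scale * 0.5 * var * (Phi ** 2).sum(0) / noise_var + 0.5 * (var - 1.0)
    return lik_scale * enll + kl, g_mu, g_log_var

# Toy data, inducing inputs, and optimization loop (all sizes are illustrative).
N, M = 500, 20
X = rng.uniform(-3.0, 3.0, size=(N, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(N)
Z = np.linspace(-3.0, 3.0, M)[:, None]
Phi = rbf_features(X, Z)

mu, log_var = np.zeros(M), np.zeros(M)
acc_mu, acc_lv = np.full(M, 1e-8), np.full(M, 1e-8)  # Adagrad accumulators
lr = 0.05
for step in range(500):
    idx = rng.choice(N, size=64, replace=False)      # mini-batch of data points
    _, g_mu, g_lv = neg_elbo_and_grads(mu, log_var, Phi[idx], y[idx], lik_scale=N / 64)
    acc_mu += g_mu ** 2
    acc_lv += g_lv ** 2
    mu -= lr * g_mu / np.sqrt(acc_mu)
    log_var -= lr * g_lv / np.sqrt(acc_lv)

print("train RMSE of posterior mean:", np.sqrt(np.mean((y - Phi @ mu) ** 2)))

In the distributed setting the paper targets, each worker would compute such mini-batch gradients on its own data shard and push them to the server asynchronously, with the server applying (proximal) updates to the shared variational parameters; the sketch above collapses that pipeline into one process for readability.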
Cite
Text
Peng et al. "Asynchronous Distributed Variational Gaussian Process for Regression." International Conference on Machine Learning, 2017.
Markdown
[Peng et al. "Asynchronous Distributed Variational Gaussian Process for Regression." International Conference on Machine Learning, 2017.](https://mlanthology.org/icml/2017/peng2017icml-asynchronous/)
BibTeX
@inproceedings{peng2017icml-asynchronous,
title = {{Asynchronous Distributed Variational Gaussian Process for Regression}},
author = {Peng, Hao and Zhe, Shandian and Zhang, Xiao and Qi, Yuan},
booktitle = {International Conference on Machine Learning},
year = {2017},
pages = {2788--2797},
volume = {70},
url = {https://mlanthology.org/icml/2017/peng2017icml-asynchronous/}
}