Scalable Gaussian Processes with Grid-Structured Eigenfunctions (GP-GRIEF)
Abstract
We introduce a kernel approximation strategy that enables computation of the Gaussian process log marginal likelihood and all hyperparameter derivatives in O(p) time. Our GRIEF kernel consists of p eigenfunctions found using a Nyström approximation from a dense Cartesian product grid of inducing points. By exploiting algebraic properties of Kronecker and Khatri-Rao tensor products, the computational complexity of the training procedure can be made practically independent of the number of inducing points. This allows us to use arbitrarily many inducing points to achieve a globally accurate kernel approximation, even in high-dimensional problems. The fast likelihood evaluation enables type-I or type-II Bayesian inference on large-scale datasets. We benchmark our algorithms on real-world problems with up to two million training points and 10^33 inducing points.
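The key algebraic property alluded to above is that a matrix with Kronecker structure, such as a kernel matrix over a Cartesian product grid of inducing points, admits matrix-vector products without ever forming the full matrix. The sketch below (illustrative only, not the authors' code) shows the standard two-factor identity vec(A X Bᵀ) = (A ⊗ B) vec(x); chaining it across dimensions is what lets the grid of inducing points grow exponentially while the cost stays tractable.

```python
import numpy as np

def kron_matvec(A, B, x):
    """Compute (A kron B) @ x without materializing the Kronecker product.

    With A (m x m) and B (n x n), forming the full Kronecker matrix costs
    O(m^2 n^2) memory and time; this reshaping identity costs O(mn(m + n)).
    Uses row-major (C-order) reshaping, matching np.kron's block layout.
    """
    m, n = A.shape[0], B.shape[0]
    X = x.reshape(m, n)            # unflatten x into an m x n matrix
    return (A @ X @ B.T).ravel()   # row-major vec(A X B^T) == (A kron B) vec(x)

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((4, 4))
x = rng.standard_normal(12)

# Agrees with the explicit Kronecker product to numerical precision.
assert np.allclose(kron_matvec(A, B, x), np.kron(A, B) @ x)
```

For a d-dimensional grid with n points per dimension, applying this identity recursively gives matrix-vector products with an n^d x n^d Kronecker matrix in O(d n^(d+1)) time rather than O(n^(2d)), which is the flavor of saving that makes 10^33 inducing points feasible.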
Cite
Text
Evans and Nair. "Scalable Gaussian Processes with Grid-Structured Eigenfunctions (GP-GRIEF)." International Conference on Machine Learning, 2018.
Markdown
[Evans and Nair. "Scalable Gaussian Processes with Grid-Structured Eigenfunctions (GP-GRIEF)." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/evans2018icml-scalable/)
BibTeX
@inproceedings{evans2018icml-scalable,
title = {{Scalable Gaussian Processes with Grid-Structured Eigenfunctions (GP-GRIEF)}},
author = {Evans, Trefor and Nair, Prasanth},
booktitle = {International Conference on Machine Learning},
year = {2018},
pages = {1417--1426},
volume = {80},
url = {https://mlanthology.org/icml/2018/evans2018icml-scalable/}
}