Fast Gaussian Process Posteriors with Product Trees
Abstract
Gaussian processes (GPs) are a powerful tool for nonparametric regression; unfortunately, calculating the posterior variance in a standard GP model requires time O(n2) in the size of the training set. Previous work by Shen et al. (2006) used a k-d tree structure to approximate the posterior mean in certain GP models. We extend this approach to achieve efficient approximation of the posterior covariance using a tree clustering on pairs of training points, and demonstrate significant improvements in performance with negligible loss of accuracy.
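For context, the O(n2) cost mentioned in the abstract arises from the quadratic form in the standard exact GP posterior variance. The sketch below shows that baseline computation (not the paper's product-tree approximation); the RBF kernel, hyperparameters, and data are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    # Squared-exponential covariance between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X, y, Xstar, noise=1e-2):
    # Exact GP regression posterior; illustrative baseline, not the
    # product-tree method from the paper.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(Xstar, X)            # cross-covariances, shape (m, n)
    Kinv = np.linalg.inv(K)              # O(n^3) once, amortized over queries
    mean = Ks @ (Kinv @ y)               # O(n) per test point after caching Kinv @ y
    # Posterior variance: the quadratic form k_*^T K^{-1} k_* costs O(n^2)
    # per test point -- the bottleneck the tree approximation targets.
    var = rbf_kernel(Xstar, Xstar).diagonal() - np.einsum(
        "ij,jk,ik->i", Ks, Kinv, Ks)
    return mean, var

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
mean, var = gp_posterior(X, y, np.array([[0.0]]))
```

The product-tree idea in the paper replaces the exact O(n²) quadratic form with a tree traversal over pairs of training points, pruning pairs whose contribution falls below a tolerance.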
Cite
Text
Moore and Russell. "Fast Gaussian Process Posteriors with Product Trees." Conference on Uncertainty in Artificial Intelligence, 2014.

Markdown
[Moore and Russell. "Fast Gaussian Process Posteriors with Product Trees." Conference on Uncertainty in Artificial Intelligence, 2014.](https://mlanthology.org/uai/2014/moore2014uai-fast/)

BibTeX
@inproceedings{moore2014uai-fast,
title = {{Fast Gaussian Process Posteriors with Product Trees}},
author = {Moore, David A. and Russell, Stuart},
booktitle = {Conference on Uncertainty in Artificial Intelligence},
year = {2014},
pages = {613-622},
url = {https://mlanthology.org/uai/2014/moore2014uai-fast/}
}