Product Trees for Gaussian Process Covariance in Sublinear Time

Abstract

Gaussian process (GP) regression is a powerful technique for nonparametric regression; unfortunately, calculating the predictive variance in a standard GP model requires time O(n²) in the size of the training set. This is cost prohibitive when GP likelihood calculations must be done in the inner loop of the inference procedure for a larger model (e.g., MCMC). Previous work by Shen et al. (2006) used a k-d tree structure to approximate the predictive mean in certain GP models. We extend this approach to achieve efficient approximation of the predictive covariance using a tree clustering on pairs of training points. We show empirically that this significantly increases performance at minimal cost in accuracy. Additionally, we apply our method to "primal/dual" models having both parametric and nonparametric components and show that this enables efficient computations even while modeling longer-scale variation.
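To make the O(n²) cost concrete, the sketch below computes the exact GP predictive variance at a single test point using the standard formula k(x*, x*) − k*ᵀ K⁻¹ k*. This is not the paper's product-tree method; it is a minimal NumPy illustration (with an assumed RBF kernel and noise level) of the dense quadratic form whose per-query cost the paper's tree-based approximation reduces.

```python
import numpy as np

# Assumed RBF kernel; the paper's covariance functions may differ.
# This only illustrates the O(n^2) predictive-variance computation.
def rbf(a, b, lengthscale=1.0):
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))    # n training inputs
noise = 0.1

K = rbf(X, X) + noise * np.eye(len(X))   # n x n training covariance
K_inv = np.linalg.inv(K)                 # O(n^3), but precomputed once

x_star = np.array([[0.5]])               # a single test point
k_star = rbf(X, x_star)                  # n x 1 cross-covariance

# Predictive variance: k(x*, x*) - k*^T K^{-1} k*.
# The quadratic form touches all n^2 entries of K_inv, hence the
# O(n^2)-per-query cost that motivates sublinear tree approximations.
var = float(rbf(x_star, x_star) - k_star.T @ K_inv @ k_star)
print(var)
```

The quadratic form k*ᵀ K⁻¹ k* is exactly the sum over pairs of training points that the paper approximates with a tree built on those pairs, allowing distant, low-weight pairs to be pruned in aggregate.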

Cite

Text

Moore and Russell. "Product Trees for Gaussian Process Covariance in Sublinear Time." Conference on Uncertainty in Artificial Intelligence, 2013.

Markdown

[Moore and Russell. "Product Trees for Gaussian Process Covariance in Sublinear Time." Conference on Uncertainty in Artificial Intelligence, 2013.](https://mlanthology.org/uai/2013/moore2013uai-product/)

BibTeX

@inproceedings{moore2013uai-product,
  title     = {{Product Trees for Gaussian Process Covariance in Sublinear Time}},
  author    = {Moore, David A. and Russell, Stuart},
  booktitle = {Conference on Uncertainty in Artificial Intelligence},
  year      = {2013},
  pages     = {58--66},
  url       = {https://mlanthology.org/uai/2013/moore2013uai-product/}
}