Variational Inference for Mahalanobis Distance Metrics in Gaussian Process Regression

Abstract

We introduce a novel variational method that allows us to approximately integrate out kernel hyperparameters, such as length-scales, in Gaussian process regression. The approach is a novel variant of the variational framework recently developed for the Gaussian process latent variable model, and it additionally makes use of a standardised representation of the Gaussian process. We apply this technique to learning Mahalanobis distance metrics in a Gaussian process regression setting and provide experimental evaluations and comparisons with existing methods on datasets with high-dimensional inputs.
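As a minimal illustrative sketch (not the authors' code or parameterisation), the snippet below shows what a Mahalanobis distance metric looks like inside a Gaussian process covariance: a squared-exponential kernel evaluated on inputs projected by a linear map W, i.e. k(x, x') = sigma_f^2 * exp(-0.5 * (x - x')^T W^T W (x - x')), used in standard GP regression. The function names, parameters, and point-estimate treatment of W here are assumptions for illustration only; the paper's contribution, per the abstract, is to integrate out such kernel hyperparameters variationally rather than fix them.

import numpy as np

def mahalanobis_rbf_kernel(X1, X2, W, sigma_f=1.0):
    """Squared-exponential kernel on inputs projected by W (shape k x D)."""
    Z1 = X1 @ W.T  # project D-dimensional inputs into a k-dimensional space
    Z2 = X2 @ W.T
    sq_dists = (
        np.sum(Z1**2, axis=1)[:, None]
        + np.sum(Z2**2, axis=1)[None, :]
        - 2.0 * Z1 @ Z2.T
    )
    return sigma_f**2 * np.exp(-0.5 * np.maximum(sq_dists, 0.0))

def gp_predict_mean(X_train, y_train, X_test, W, sigma_f=1.0, sigma_n=0.1):
    """Standard GP regression posterior mean with the kernel above and noise variance sigma_n^2."""
    K = mahalanobis_rbf_kernel(X_train, X_train, W, sigma_f)
    K_star = mahalanobis_rbf_kernel(X_test, X_train, W, sigma_f)
    alpha = np.linalg.solve(K + sigma_n**2 * np.eye(len(X_train)), y_train)
    return K_star @ alpha

When W is diagonal this reduces to the familiar ARD squared-exponential kernel with per-dimension length-scales; a full (low-rank) W is what makes the metric a learned Mahalanobis distance.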

Cite

Text

Titsias and Lázaro-Gredilla. "Variational Inference for Mahalanobis Distance Metrics in Gaussian Process Regression." Neural Information Processing Systems, 2013.

Markdown

[Titsias and Lázaro-Gredilla. "Variational Inference for Mahalanobis Distance Metrics in Gaussian Process Regression." Neural Information Processing Systems, 2013.](https://mlanthology.org/neurips/2013/aueb2013neurips-variational/)

BibTeX

@inproceedings{aueb2013neurips-variational,
  title     = {{Variational Inference for Mahalanobis Distance Metrics in Gaussian Process Regression}},
  author    = {Titsias, Michalis K. and L{\'a}zaro-Gredilla, Miguel},
  booktitle = {Neural Information Processing Systems},
  year      = {2013},
  pages     = {279--287},
  url       = {https://mlanthology.org/neurips/2013/aueb2013neurips-variational/}
}