Scalable Gaussian Process Classification via Expectation Propagation

Abstract

Variational methods have recently been considered for scaling the training process of Gaussian process classifiers to large datasets. As an alternative, we describe here how to train these classifiers efficiently using expectation propagation. The proposed method allows for handling datasets with millions of data instances. More precisely, it can be used for (i) training in a distributed fashion, where the data instances are sent to different nodes in which the required computations are carried out, and for (ii) maximizing an estimate of the marginal likelihood using a stochastic approximation of the gradient. Several experiments indicate that the method described is competitive with the variational approach.

Cite

Text

Hernández-Lobato and Hernández-Lobato. "Scalable Gaussian Process Classification via Expectation Propagation." International Conference on Artificial Intelligence and Statistics, 2016.

Markdown

[Hernández-Lobato and Hernández-Lobato. "Scalable Gaussian Process Classification via Expectation Propagation." International Conference on Artificial Intelligence and Statistics, 2016.](https://mlanthology.org/aistats/2016/hernandezlobato2016aistats-scalable/)

BibTeX

@inproceedings{hernandezlobato2016aistats-scalable,
  title     = {{Scalable Gaussian Process Classification via Expectation Propagation}},
  author    = {Hernández-Lobato, Daniel and Hernández-Lobato, José Miguel},
  booktitle = {International Conference on Artificial Intelligence and Statistics},
  year      = {2016},
  pages     = {168--176},
  url       = {https://mlanthology.org/aistats/2016/hernandezlobato2016aistats-scalable/}
}