Infinite Mixtures of Gaussian Process Experts
Abstract
We present an extension to the Mixture of Experts (ME) model, where the individual experts are Gaussian Process (GP) regression models. Using an input-dependent adaptation of the Dirichlet Process, we implement a gating network for an infinite number of experts. Inference in this model may be done efficiently using a Markov Chain relying on Gibbs sampling. The model allows the effective covariance function to vary with the inputs, and may handle large datasets – thus potentially overcoming two of the biggest hurdles with GP models. Simulations show the viability of this approach.
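The core idea in the abstract – that gating between GP experts with different covariance functions yields an effective covariance that varies with the input – can be illustrated with a minimal numpy sketch. This is not the paper's method (no Dirichlet Process gating, no Gibbs sampling over assignments): it uses just two fixed GP experts with different lengthscales, blended by a hand-picked sigmoid gate, and all hyperparameter values are illustrative.

```python
import numpy as np

def se_kernel(x1, x2, lengthscale, signal_var):
    """Squared-exponential covariance between two 1-D input vectors."""
    d = x1[:, None] - x2[None, :]
    return signal_var * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_predict_mean(x_train, y_train, x_test, lengthscale, signal_var, noise_var):
    """GP regression predictive mean with an SE kernel."""
    K = se_kernel(x_train, x_train, lengthscale, signal_var) \
        + noise_var * np.eye(len(x_train))
    Ks = se_kernel(x_test, x_train, lengthscale, signal_var)
    return Ks @ np.linalg.solve(K, y_train)

# Toy data: fast-varying on the left half, smooth on the right half.
x = np.linspace(0.0, 1.0, 50)
y = np.where(x < 0.5, np.sin(20.0 * x), 0.2 * x)
xs = np.linspace(0.0, 1.0, 101)

# Two GP experts with different covariance functions (lengthscales).
mu_short = gp_predict_mean(x, y, xs, lengthscale=0.05, signal_var=1.0, noise_var=1e-4)
mu_long = gp_predict_mean(x, y, xs, lengthscale=0.5, signal_var=1.0, noise_var=1e-4)

# Input-dependent gate (hand-picked sigmoid, for illustration only):
# favors the short-lengthscale expert on the left half of the input space.
gate = 1.0 / (1.0 + np.exp(50.0 * (xs - 0.5)))
mu = gate * mu_short + (1.0 - gate) * mu_long
```

Because the gate depends on the input, the blended predictor behaves like a GP with a short lengthscale on the left and a long lengthscale on the right – the input-varying effective covariance the abstract refers to. The paper replaces the fixed gate with an input-dependent Dirichlet Process, so the number of experts is unbounded and assignments are inferred by Gibbs sampling.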
Cite
Text
Rasmussen and Ghahramani. "Infinite Mixtures of Gaussian Process Experts." Neural Information Processing Systems, 2001.
Markdown
[Rasmussen and Ghahramani. "Infinite Mixtures of Gaussian Process Experts." Neural Information Processing Systems, 2001.](https://mlanthology.org/neurips/2001/rasmussen2001neurips-infinite/)
BibTeX
@inproceedings{rasmussen2001neurips-infinite,
title = {{Infinite Mixtures of Gaussian Process Experts}},
author = {Rasmussen, Carl E. and Ghahramani, Zoubin},
booktitle = {Neural Information Processing Systems},
year = {2001},
pages = {881--888},
url = {https://mlanthology.org/neurips/2001/rasmussen2001neurips-infinite/}
}