Hierarchical Double Dirichlet Process Mixture of Gaussian Processes

Abstract

We consider an infinite mixture model of Gaussian processes that shares mixture components between non-local clusters in the data. Meeds and Osindero (2006) use a single Dirichlet process prior to specify a mixture of Gaussian processes using an infinite number of experts. In this paper, we extend this approach to allow experts to be shared non-locally across the input domain. This is accomplished with a hierarchical double Dirichlet process prior, which builds upon a standard hierarchical Dirichlet process by incorporating local parameters that are unique to each cluster while sharing mixture components between them. We evaluate the model on simulated and real data, showing that sharing Gaussian process components non-locally can yield effective and useful models for richly clustered, non-stationary, non-linear data.
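To make the setting concrete, the following is a minimal generative sketch of the baseline the paper extends: a single-Dirichlet-process mixture of Gaussian process experts in the spirit of Meeds and Osindero (2006), using a truncated stick-breaking construction. It is not the paper's hierarchical double Dirichlet process model; all function names, hyperparameter ranges, and the truncation level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)


def stick_breaking(alpha, num_components, rng):
    """Truncated stick-breaking construction of Dirichlet process weights."""
    betas = rng.beta(1.0, alpha, size=num_components)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining


def rbf_kernel(x, lengthscale, variance):
    """Squared-exponential covariance matrix for 1-D inputs."""
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)


def sample_dp_mixture_of_gps(x, alpha=2.0, truncation=10, noise=0.05, rng=rng):
    """Draw data from a (truncated) DP mixture of GP experts.

    Each mixture component ("expert") owns its own GP hyperparameters,
    and every input point is assigned to exactly one expert. Illustrative
    sketch only; hyperparameter priors are arbitrary choices.
    """
    weights = stick_breaking(alpha, truncation, rng)
    weights = weights / weights.sum()  # renormalize after truncation
    assignments = rng.choice(truncation, size=len(x), p=weights)
    y = np.zeros_like(x)
    for k in np.unique(assignments):
        idx = assignments == k
        # Expert-specific (local) GP hyperparameters.
        lengthscale = rng.uniform(0.1, 1.0)
        variance = rng.uniform(0.5, 2.0)
        cov = rbf_kernel(x[idx], lengthscale, variance)
        cov += noise**2 * np.eye(idx.sum())
        y[idx] = rng.multivariate_normal(np.zeros(idx.sum()), cov)
    return y, assignments


x = np.linspace(0.0, 5.0, 100)
y, z = sample_dp_mixture_of_gps(x)
```

In this baseline, experts are tied to contiguous regions only through the assignment pattern; the paper's contribution is a prior that lets distant clusters reuse the same GP component while keeping cluster-specific local parameters.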

Cite

Text

Tayal et al. "Hierarchical Double Dirichlet Process Mixture of Gaussian Processes." AAAI Conference on Artificial Intelligence, 2012. doi:10.1609/aaai.v26i1.8309

Markdown

[Tayal et al. "Hierarchical Double Dirichlet Process Mixture of Gaussian Processes." AAAI Conference on Artificial Intelligence, 2012.](https://mlanthology.org/aaai/2012/tayal2012aaai-hierarchical/) doi:10.1609/aaai.v26i1.8309

BibTeX

@inproceedings{tayal2012aaai-hierarchical,
  title     = {{Hierarchical Double Dirichlet Process Mixture of Gaussian Processes}},
  author    = {Tayal, Aditya and Poupart, Pascal and Li, Yuying},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2012},
  pages     = {1126--1133},
  doi       = {10.1609/aaai.v26i1.8309},
  url       = {https://mlanthology.org/aaai/2012/tayal2012aaai-hierarchical/}
}