Expectation Propagation in Gaussian Process Dynamical Systems
Abstract
Rich and complex time-series data, such as those generated from engineering systems, financial markets, videos or neural recordings, are now a common feature of modern data analysis. Explaining the phenomena underlying these diverse data sets requires flexible and accurate models. In this paper, we promote Gaussian process dynamical systems as a rich model class appropriate for such analysis. In particular, we present a message passing algorithm for approximate inference in GPDSs based on expectation propagation. By phrasing inference as a general message passing problem, we iterate forward-backward smoothing. We obtain more accurate posterior distributions over latent structures, resulting in improved predictive performance compared to state-of-the-art GPDS smoothers, which are special cases of our general iterative message passing algorithm. Hence, we provide a unifying approach within which to contextualize message passing in GPDSs.
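The core EP loop the abstract refers to (forming a cavity distribution, moment matching against a tilted distribution, and updating site approximations) can be illustrated on a toy model. The sketch below is not the paper's GPDS smoother; it is a minimal EP routine for a single Gaussian latent variable with Gaussian observations, where the helper `ep_gaussian` and all parameter values are illustrative assumptions. Because every factor is Gaussian here, EP recovers the exact posterior, which makes the mechanics easy to check.

```python
import numpy as np

def ep_gaussian(y, sigma2=0.5, prior_var=1.0, n_sweeps=5):
    """Illustrative EP for x ~ N(0, prior_var) with observations y_i ~ N(x, sigma2).

    Sites are parameterized by natural parameters (precision tau_i,
    precision-weighted mean nu_i). With Gaussian factors, EP is exact;
    the cavity / moment-matching / site-update structure mirrors the
    messages iterated in GPDS forward-backward smoothing.
    """
    n = len(y)
    tau = np.zeros(n)                      # site precisions
    nu = np.zeros(n)                       # site precision-weighted means
    tau_post = 1.0 / prior_var + tau.sum() # posterior precision
    nu_post = nu.sum()                     # posterior precision-weighted mean
    for _ in range(n_sweeps):
        for i in range(n):
            # 1) cavity: remove site i from the current posterior
            tau_cav = tau_post - tau[i]
            nu_cav = nu_post - nu[i]
            m_cav, v_cav = nu_cav / tau_cav, 1.0 / tau_cav
            # 2) tilted distribution: cavity times the exact likelihood
            v_tilt = 1.0 / (1.0 / v_cav + 1.0 / sigma2)
            m_tilt = v_tilt * (m_cav / v_cav + y[i] / sigma2)
            # 3) moment match and update site: new site = tilted / cavity
            tau[i] = 1.0 / v_tilt - tau_cav
            nu[i] = m_tilt / v_tilt - nu_cav
            tau_post = tau_cav + tau[i]
            nu_post = nu_cav + nu[i]
    return nu_post / tau_post, 1.0 / tau_post  # posterior mean, variance

y = np.array([0.8, 1.2, 1.0])
m, v = ep_gaussian(y)
# exact posterior: precision = 1/1 + 3/0.5 = 7, mean = (sum(y)/0.5)/7 = 6/7
```

In the non-Gaussian GPDS setting the tilted moments are intractable and must be approximated, but the surrounding cavity-and-update loop, and the fact that a single forward-backward sweep is a special case of iterating it, carries over directly.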
Cite
Text
Deisenroth and Mohamed. "Expectation Propagation in Gaussian Process Dynamical Systems." Neural Information Processing Systems, 2012.
Markdown
[Deisenroth and Mohamed. "Expectation Propagation in Gaussian Process Dynamical Systems." Neural Information Processing Systems, 2012.](https://mlanthology.org/neurips/2012/deisenroth2012neurips-expectation/)
BibTeX
@inproceedings{deisenroth2012neurips-expectation,
title = {{Expectation Propagation in Gaussian Process Dynamical Systems}},
author = {Deisenroth, Marc and Mohamed, Shakir},
booktitle = {Neural Information Processing Systems},
year = {2012},
  pages = {2609--2617},
url = {https://mlanthology.org/neurips/2012/deisenroth2012neurips-expectation/}
}