Scalable Bayesian Learning for State Space Models Using Variational Inference with SMC Samplers

Abstract

We present a scalable approach to performing approximate fully Bayesian inference in generic state space models. The proposed method is an alternative to particle MCMC that provides fully Bayesian inference of both the dynamic latent states and the static parameters of the model. We build on recent advances in computational statistics that combine variational methods with sequential Monte Carlo sampling, and we demonstrate the advantages of performing full Bayesian inference over the static parameters rather than just computing variational EM approximations. We illustrate how our approach enables scalable inference in multivariate stochastic volatility models and in self-exciting point process models that allow for flexible dynamics in the latent intensity function.
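To make the combination of variational inference with sequential Monte Carlo mentioned in the abstract concrete, here is a minimal NumPy sketch of the general idea. Everything in it is an illustrative assumption rather than the paper's method: the toy univariate stochastic volatility model, the standard normal prior, and the names `particle_filter_loglik` and `elbo_estimate` are all hypothetical. In particular, the paper employs SMC samplers over the static parameters, whereas this sketch simply plugs a bootstrap particle filter's evidence estimate into a reparameterised ELBO for a Gaussian variational posterior over the static parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_loglik(y, phi, sigma, num_particles=200):
    """Bootstrap particle filter log-likelihood estimate for a toy
    univariate stochastic volatility model (hypothetical example):
        x_t = phi * x_{t-1} + sigma * eps_t,   y_t ~ N(0, exp(x_t))."""
    x = rng.normal(0.0, sigma / np.sqrt(max(1.0 - phi**2, 1e-6)),
                   size=num_particles)            # stationary initialisation
    loglik = 0.0
    for yt in y:
        x = phi * x + sigma * rng.normal(size=num_particles)   # propagate
        # log N(y_t; 0, exp(x)) for each particle
        logw = -0.5 * (np.log(2 * np.pi) + x + yt**2 * np.exp(-x))
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())            # incremental evidence
        x = rng.choice(x, size=num_particles, p=w / w.sum())   # resample
    return loglik

def elbo_estimate(y, mu, log_std):
    """One reparameterised Monte Carlo estimate of an ELBO that uses the
    particle filter evidence estimate in place of the intractable
    log-likelihood; theta = (atanh(phi), log(sigma))."""
    eps = rng.normal(size=2)
    theta = mu + np.exp(log_std) * eps            # reparameterisation trick
    phi, sigma = np.tanh(theta[0]), np.exp(theta[1])
    log_prior = -0.5 * np.sum(theta**2)           # N(0, I) prior (assumed);
    log_q = np.sum(-log_std - 0.5 * eps**2)       # 2*pi constants cancel
    return particle_filter_loglik(y, phi, sigma) + log_prior - log_q

# Synthetic data from the toy model, then one noisy ELBO evaluation.
T, phi_true, sigma_true = 200, 0.95, 0.3
x, y = 0.0, np.empty(T)
for t in range(T):
    x = phi_true * x + sigma_true * rng.normal()
    y[t] = np.exp(0.5 * x) * rng.normal()
print(elbo_estimate(y, mu=np.zeros(2), log_std=np.full(2, -1.0)))
```

In practice one would average several such noisy estimates per step and optimise `mu` and `log_std` by stochastic gradient ascent; the log of the particle filter estimate is a biased but consistent surrogate for the true log-likelihood, with bias shrinking as the number of particles grows.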

Cite

Text

Hirt and Dellaportas. "Scalable Bayesian Learning for State Space Models Using Variational Inference with SMC Samplers." Artificial Intelligence and Statistics, 2019.

Markdown

[Hirt and Dellaportas. "Scalable Bayesian Learning for State Space Models Using Variational Inference with SMC Samplers." Artificial Intelligence and Statistics, 2019.](https://mlanthology.org/aistats/2019/hirt2019aistats-scalable/)

BibTeX

@inproceedings{hirt2019aistats-scalable,
  title     = {{Scalable Bayesian Learning for State Space Models Using Variational Inference with SMC Samplers}},
  author    = {Hirt, Marcel and Dellaportas, Petros},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2019},
  pages     = {76--86},
  volume    = {89},
  url       = {https://mlanthology.org/aistats/2019/hirt2019aistats-scalable/}
}