Structured Inference Networks for Nonlinear State Space Models
Abstract
Gaussian state space models have been used for decades as generative models of sequential data. They admit an intuitive probabilistic interpretation, have a simple functional form, and enjoy widespread adoption. We introduce a unified algorithm to efficiently learn a broad class of linear and non-linear state space models, including variants where the emission and transition distributions are modeled by deep neural networks. Our learning algorithm simultaneously learns a compiled inference network and the generative model, leveraging a structured variational approximation parameterized by recurrent neural networks to mimic the posterior distribution. We apply the learning algorithm to both synthetic and real-world datasets, demonstrating its scalability and versatility. We find that using the structured approximation to the posterior results in models with significantly higher held-out likelihood.
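To make the setup concrete, below is a minimal sketch of the kind of model the abstract describes: a Gaussian state space model whose transition and emission distributions are parameterized by neural networks, paired with an RNN-based inference network that approximates the posterior over latent states. All module names, layer sizes, and the choice of a backward GRU are illustrative assumptions for this sketch, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class DeepStateSpaceModel(nn.Module):
    """Generative model: Gaussian transition p(z_t | z_{t-1}) and emission p(x_t | z_t),
    each with mean and log-variance produced by a small MLP."""
    def __init__(self, z_dim=8, x_dim=16, hidden=64):
        super().__init__()
        self.trans = nn.Sequential(nn.Linear(z_dim, hidden), nn.ReLU(),
                                   nn.Linear(hidden, 2 * z_dim))
        self.emit = nn.Sequential(nn.Linear(z_dim, hidden), nn.ReLU(),
                                  nn.Linear(hidden, 2 * x_dim))

    def transition(self, z_prev):
        mu, logvar = self.trans(z_prev).chunk(2, dim=-1)
        return mu, logvar

    def emission(self, z):
        mu, logvar = self.emit(z).chunk(2, dim=-1)
        return mu, logvar

class StructuredInferenceNetwork(nn.Module):
    """Structured approximate posterior q(z_t | z_{t-1}, x_{t:T}):
    a backward RNN summarizes future observations, and each z_t is sampled
    conditioned on the previous latent state and that summary."""
    def __init__(self, z_dim=8, x_dim=16, hidden=64):
        super().__init__()
        self.z_dim = z_dim
        self.rnn = nn.GRU(x_dim, hidden, batch_first=True)
        self.combine = nn.Sequential(nn.Linear(z_dim + hidden, hidden), nn.ReLU(),
                                     nn.Linear(hidden, 2 * z_dim))

    def forward(self, x):
        # Run the RNN over the reversed sequence so h[:, t] summarizes x_{t:T}.
        h, _ = self.rnn(torch.flip(x, dims=[1]))
        h = torch.flip(h, dims=[1])
        B, T, _ = x.shape
        z_prev = torch.zeros(B, self.z_dim)
        zs, mus, logvars = [], [], []
        for t in range(T):
            stats = self.combine(torch.cat([z_prev, h[:, t]], dim=-1))
            mu, logvar = stats.chunk(2, dim=-1)
            # Reparameterization trick: sample z_t differentiably.
            z_prev = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
            zs.append(z_prev); mus.append(mu); logvars.append(logvar)
        return torch.stack(zs, 1), torch.stack(mus, 1), torch.stack(logvars, 1)
```

In such a setup, the generative model and the inference network would be trained jointly by maximizing a variational lower bound on the data log-likelihood, with gradients flowing through the sampled latent states via the reparameterization above; the exact bound and training details are given in the paper itself.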
Cite
Text
Krishnan et al. "Structured Inference Networks for Nonlinear State Space Models." AAAI Conference on Artificial Intelligence, 2017. doi:10.1609/AAAI.V31I1.10779
Markdown
[Krishnan et al. "Structured Inference Networks for Nonlinear State Space Models." AAAI Conference on Artificial Intelligence, 2017.](https://mlanthology.org/aaai/2017/krishnan2017aaai-structured/) doi:10.1609/AAAI.V31I1.10779
BibTeX
@inproceedings{krishnan2017aaai-structured,
title = {{Structured Inference Networks for Nonlinear State Space Models}},
author = {Krishnan, Rahul G. and Shalit, Uri and Sontag, David A.},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2017},
pages = {2101--2109},
doi = {10.1609/AAAI.V31I1.10779},
url = {https://mlanthology.org/aaai/2017/krishnan2017aaai-structured/}
}