Variational Temporal Abstraction

Abstract

We introduce a variational approach to learning and inference of temporally hierarchical structure and representation for sequential data. We propose the Variational Temporal Abstraction (VTA), a hierarchical recurrent state space model that can infer the latent temporal structure and thus perform the stochastic state transition hierarchically. We also propose applying this model to implement jumpy imagination in imagination-augmented agent learning, thereby improving the efficiency of imagination. In experiments, we demonstrate that our proposed method can model 2D and 3D visual sequence datasets with interpretable temporal structure discovery, and that its application to jumpy imagination enables more efficient agent learning in a 3D navigation task.
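To make the idea of hierarchical stochastic state transitions concrete, below is a minimal toy sketch of a two-level recurrent state-space transition in which a sampled binary boundary indicator decides whether the slow abstract state transitions or is copied forward. This is an illustration of the general mechanism only, not the authors' implementation: the weight matrices, dimensionality, and fixed boundary probability are all invented for the sketch (in VTA the boundary indicator and transitions are learned and inferred variationally).

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4  # latent dimensionality (arbitrary for this sketch)

# Random fixed weights standing in for learned transition parameters.
W_aa = rng.normal(scale=0.5, size=(D, D))  # abstract-state transition
W_oa = rng.normal(scale=0.5, size=(D, D))  # abstract -> observation-level init
W_oo = rng.normal(scale=0.5, size=(D, D))  # observation-level transition
W_ox = rng.normal(scale=0.5, size=(D, D))  # input -> observation-level

def step(z_abs, z_obs, x_t, boundary_prob=0.3):
    """One transition of a toy two-level recurrent state-space model.

    A Bernoulli boundary indicator m_t decides whether the slow abstract
    state z_abs transitions (m_t = 1, start of a new temporal segment) or
    is copied forward (m_t = 0) while only the fast state z_obs updates.
    """
    m_t = bool(rng.random() < boundary_prob)
    if m_t:
        z_abs = np.tanh(W_aa @ z_abs)               # new abstract state
        z_obs = np.tanh(W_oa @ z_abs + W_ox @ x_t)  # re-initialize fast state
    else:
        z_obs = np.tanh(W_oo @ z_obs + W_ox @ x_t)  # ordinary fast transition
    return z_abs, z_obs, m_t

# Roll out a short sequence of random inputs and record the boundaries.
z_abs, z_obs = np.ones(D), np.zeros(D)
boundaries = []
for t in range(20):
    x_t = rng.normal(size=D)
    z_abs, z_obs, m_t = step(z_abs, z_obs, x_t)
    boundaries.append(m_t)
print("boundary pattern:", "".join("1" if m else "0" for m in boundaries))
```

A "jumpy" rollout then corresponds to transitioning only the abstract state at boundaries, skipping the fast intermediate steps, which is what makes imagination cheaper per unit of simulated time.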

Cite

Text

Kim et al. "Variational Temporal Abstraction." Neural Information Processing Systems, 2019.

Markdown

[Kim et al. "Variational Temporal Abstraction." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/kim2019neurips-variational/)

BibTeX

@inproceedings{kim2019neurips-variational,
  title     = {{Variational Temporal Abstraction}},
  author    = {Kim, Taesup and Ahn, Sungjin and Bengio, Yoshua},
  booktitle = {Neural Information Processing Systems},
  year      = {2019},
  pages     = {11570--11579},
  url       = {https://mlanthology.org/neurips/2019/kim2019neurips-variational/}
}