A Recurrent Latent Variable Model for Sequential Data

Abstract

In this paper, we explore the inclusion of latent random variables into the hidden state of a recurrent neural network (RNN) by combining elements of the variational autoencoder. We argue that through the use of high-level latent random variables, the variational RNN (VRNN) can model the kind of variability observed in highly structured sequential data such as natural speech. We empirically evaluate the proposed model against other related sequential models on four speech datasets and one handwriting dataset. Our results show the important roles that latent random variables can play in the RNN dynamics.
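The core idea of the abstract, conditioning a VAE-style latent variable on the RNN hidden state at every timestep, can be sketched compactly. Below is a minimal NumPy sketch of a single VRNN timestep; it is not the paper's implementation. The dimensions, the random weight initialization, and the helper names (`dense`, `apply`, `vrnn_step`) are illustrative assumptions, while the component roles (feature extractors for x and z, a state-dependent prior, an approximate posterior conditioned on the input and the previous hidden state, a Gaussian decoder, and a deterministic recurrence) follow the model structure the paper describes. The training objective, a timestep-wise ELBO with a KL term between posterior and prior, is omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions only; the paper's experiments use far larger models.
x_dim, z_dim, h_dim = 3, 2, 8

def dense(in_dim, out_dim):
    """Randomly initialized affine layer (untrained, for shape illustration)."""
    return 0.1 * rng.standard_normal((in_dim, out_dim)), np.zeros(out_dim)

def apply(params, x, act=np.tanh):
    w, b = params
    return act(x @ w + b)

identity = lambda a: a
softplus = lambda a: np.log1p(np.exp(a))  # keeps standard deviations positive

# Feature extractors and conditional networks (phi_* in the paper's notation).
phi_x = dense(x_dim, h_dim)                      # features of the input x_t
phi_z = dense(z_dim, h_dim)                      # features of the latent z_t
prior_mu, prior_sig = dense(h_dim, z_dim), dense(h_dim, z_dim)
enc_mu, enc_sig = dense(2 * h_dim, z_dim), dense(2 * h_dim, z_dim)
dec_mu, dec_sig = dense(2 * h_dim, x_dim), dense(2 * h_dim, x_dim)
rec = dense(3 * h_dim, h_dim)                    # recurrence f(phi_x, phi_z, h)

def vrnn_step(x_t, h_prev):
    """One VRNN timestep: prior, approximate posterior, sample, decode, recur."""
    fx = apply(phi_x, x_t)
    # Prior over z_t depends on the previous hidden state (unlike a plain VAE).
    mu0 = apply(prior_mu, h_prev, act=identity)
    sig0 = apply(prior_sig, h_prev, act=softplus)
    # Approximate posterior conditions on both x_t and h_{t-1}.
    eh = np.concatenate([fx, h_prev])
    mu = apply(enc_mu, eh, act=identity)
    sig = apply(enc_sig, eh, act=softplus)
    # Reparameterization trick: z_t = mu + sig * eps.
    z_t = mu + sig * rng.standard_normal(z_dim)
    fz = apply(phi_z, z_t)
    # Decoder p(x_t | z_t, h_{t-1}) as a diagonal Gaussian.
    dh = np.concatenate([fz, h_prev])
    x_mu = apply(dec_mu, dh, act=identity)
    x_sig = apply(dec_sig, dh, act=softplus)
    # Deterministic recurrence: h_t = f(phi_x(x_t), phi_z(z_t), h_{t-1}).
    h_t = apply(rec, np.concatenate([fx, fz, h_prev]))
    return h_t, (mu0, sig0), (mu, sig), (x_mu, x_sig)

h = np.zeros(h_dim)
for x_t in rng.standard_normal((5, x_dim)):  # a toy length-5 sequence
    h, prior, posterior, likelihood = vrnn_step(x_t, h)
```

The distinguishing choice this sketch illustrates is that the prior over z_t is a function of h_{t-1} rather than a fixed standard Gaussian, so the latent variable is woven into the RNN dynamics instead of being applied independently at each frame.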

Cite

Text

Chung et al. "A Recurrent Latent Variable Model for Sequential Data." Neural Information Processing Systems, 2015.

Markdown

[Chung et al. "A Recurrent Latent Variable Model for Sequential Data." Neural Information Processing Systems, 2015.](https://mlanthology.org/neurips/2015/chung2015neurips-recurrent/)

BibTeX

@inproceedings{chung2015neurips-recurrent,
  title     = {{A Recurrent Latent Variable Model for Sequential Data}},
  author    = {Chung, Junyoung and Kastner, Kyle and Dinh, Laurent and Goel, Kratarth and Courville, Aaron C. and Bengio, Yoshua},
  booktitle = {Neural Information Processing Systems},
  year      = {2015},
  pages     = {2980--2988},
  url       = {https://mlanthology.org/neurips/2015/chung2015neurips-recurrent/}
}