Better Generative Models for Sequential Data Problems: Bidirectional Recurrent Mixture Density Networks

Abstract

This paper describes bidirectional recurrent mixture density networks, which can model multi-modal distributions of the type P(x_t | y_1^T) and P(x_t | x_1, x_2, ..., x_{t-1}, y_1^T) without any explicit assumptions about the use of context. These expressions occur frequently in pattern recognition problems with sequential data, for example in speech recognition. Experiments show that the proposed generative models give a higher likelihood on test data compared to a traditional modeling approach, indicating that they can summarize the statistical properties of the data better.
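To make the idea concrete, a mixture density network maps a (here, recurrent) hidden state to the parameters of a Gaussian mixture over the target x_t, so that multi-modal conditional distributions can be represented. The following is a minimal sketch, not the paper's implementation; the function and weight names (`mdn_params`, `W_pi`, etc.) are hypothetical:

```python
import numpy as np

def mdn_params(h, W_pi, W_mu, W_s):
    """Map a hidden vector h to 1-D Gaussian mixture parameters.

    In a bidirectional recurrent MDN, h would be the concatenated
    forward/backward hidden state at time t, so the mixture
    approximates P(x_t | y_1^T).
    """
    logits = h @ W_pi
    pi = np.exp(logits - logits.max())
    pi /= pi.sum()                      # mixture weights via softmax
    mu = h @ W_mu                       # component means
    sigma = np.exp(h @ W_s)             # positive std devs via exp
    return pi, mu, sigma

def mdn_log_likelihood(x, pi, mu, sigma):
    """log p(x) under the 1-D Gaussian mixture (pi, mu, sigma)."""
    comp = pi * np.exp(-0.5 * ((x - mu) / sigma) ** 2) \
           / (sigma * np.sqrt(2.0 * np.pi))
    return np.log(comp.sum())
```

Training would then maximize the summed log-likelihood of the observed x_t over the sequence, which is how the paper's models are compared against a traditional approach (higher test likelihood indicating a better statistical summary of the data).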

Cite

Text

Schuster. "Better Generative Models for Sequential Data Problems: Bidirectional Recurrent Mixture Density Networks." Neural Information Processing Systems, 1999.

Markdown

[Schuster. "Better Generative Models for Sequential Data Problems: Bidirectional Recurrent Mixture Density Networks." Neural Information Processing Systems, 1999.](https://mlanthology.org/neurips/1999/schuster1999neurips-better/)

BibTeX

@inproceedings{schuster1999neurips-better,
  title     = {{Better Generative Models for Sequential Data Problems: Bidirectional Recurrent Mixture Density Networks}},
  author    = {Schuster, Mike},
  booktitle = {Neural Information Processing Systems},
  year      = {1999},
  pages     = {589--595},
  url       = {https://mlanthology.org/neurips/1999/schuster1999neurips-better/}
}