Recurrent Ladder Networks

Abstract

We propose a recurrent extension of the Ladder networks, whose structure is motivated by the inference required in hierarchical latent variable models. We demonstrate that the recurrent Ladder is able to handle a wide variety of complex learning tasks that benefit from iterative inference and temporal modeling. The architecture achieves close-to-optimal results on temporal modeling of video data, competitive results on music modeling, and improved perceptual grouping based on higher-order abstractions, such as stochastic textures and motion cues. We present results for fully supervised, semi-supervised, and unsupervised tasks. The results suggest that the proposed architecture and principles are powerful tools for learning a hierarchy of abstractions, learning iterative inference, and handling temporal information.
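To make the architecture description concrete, here is a minimal, illustrative sketch of one step of a recurrent Ladder-style cell, assuming the wiring the abstract suggests: a bottom-up encoder pass and a top-down decoder pass with lateral connections at each level, with the decoder states carried over as the recurrent input to the next time step. The class name `RecurrentLadderCell`, the layer sizes, and the plain linear-plus-tanh mappings are hypothetical simplifications for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class RecurrentLadderCell(nn.Module):
    """One time step of a recurrent Ladder-style network (illustrative only)."""

    def __init__(self, sizes):
        super().__init__()
        # sizes[0] is the input dimension; sizes[1:] are the hidden levels.
        self.sizes = sizes
        # Bottom-up encoders: level l sees the activation from level l-1
        # concatenated with the previous step's decoder state at level l
        # (the temporal recurrence).
        self.encoders = nn.ModuleList(
            nn.Linear(sizes[l - 1] + sizes[l], sizes[l])
            for l in range(1, len(sizes))
        )
        # Top-down decoders: level l combines the decoder state from the
        # level above with the lateral encoder activation at level l; the
        # top level sees only its own encoder activation.
        self.decoders = nn.ModuleList(
            nn.Linear((sizes[l + 1] if l + 1 < len(sizes) else 0) + sizes[l],
                      sizes[l])
            for l in range(1, len(sizes))
        )
        # Readout from the lowest decoder state, e.g. to predict the next frame.
        self.readout = nn.Linear(sizes[1], sizes[0])

    def forward(self, x, prev_dec):
        # Bottom-up pass.
        h, enc = x, []
        for l, layer in enumerate(self.encoders):
            h = torch.tanh(layer(torch.cat([h, prev_dec[l]], dim=-1)))
            enc.append(h)
        # Top-down pass with lateral connections.
        dec = [None] * len(self.decoders)
        top = len(self.decoders) - 1
        dec[top] = torch.tanh(self.decoders[top](enc[top]))
        for l in range(top - 1, -1, -1):
            dec[l] = torch.tanh(
                self.decoders[l](torch.cat([dec[l + 1], enc[l]], dim=-1)))
        # Prediction for this step, plus the states passed to the next step.
        return self.readout(dec[0]), dec

# Iterate over a sequence, carrying decoder states across time steps.
sizes = [64, 128, 32]                      # input dim, two hidden levels
cell = RecurrentLadderCell(sizes)
states = [torch.zeros(8, s) for s in sizes[1:]]
for x in torch.randn(10, 8, 64):           # 10 steps, batch of 8
    prediction, states = cell(x, states)
```

Carrying the top-down states across steps is what lets a single cell both refine its inference of the current input and propagate temporal context, the two properties the abstract highlights.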

Cite

Text

Prémont-Schwarz et al. "Recurrent Ladder Networks." Neural Information Processing Systems, 2017.

Markdown

[Prémont-Schwarz et al. "Recurrent Ladder Networks." Neural Information Processing Systems, 2017.](https://mlanthology.org/neurips/2017/premontschwarz2017neurips-recurrent/)

BibTeX

@inproceedings{premontschwarz2017neurips-recurrent,
  title     = {{Recurrent Ladder Networks}},
  author    = {Prémont-Schwarz, Isabeau and Ilin, Alexander and Hao, Tele and Rasmus, Antti and Boney, Rinu and Valpola, Harri},
  booktitle = {Neural Information Processing Systems},
  year      = {2017},
  pages     = {6009--6019},
  url       = {https://mlanthology.org/neurips/2017/premontschwarz2017neurips-recurrent/}
}