A Recurrent Neural Network Without Chaos
Abstract
We introduce an exceptionally simple gated recurrent neural network (RNN) that achieves performance comparable to well-known gated architectures, such as LSTMs and GRUs, on the word-level language modeling task. We prove that our model has simple, predictable and non-chaotic dynamics. This stands in stark contrast to more standard gated architectures, whose underlying dynamical systems exhibit chaotic behavior.
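To make the abstract's claim concrete, here is a minimal NumPy sketch of the paper's chaos-free update (the CFN cell), written from the published equations h_t = θ_t ⊙ tanh(h_{t-1}) + η_t ⊙ tanh(W x_t) with sigmoid gates θ_t and η_t; the weight shapes, initialization, and the small decay demo are illustrative, not the authors' training setup.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class CFNCell:
    """One step of the chaos-free network (CFN):
        theta_t = sigmoid(U_theta h + V_theta x + b_theta)   # forget-style gate
        eta_t   = sigmoid(U_eta   h + V_eta   x + b_eta)     # input gate
        h_t     = theta_t * tanh(h) + eta_t * tanh(W x)
    Initialization below is an illustrative choice, not from the paper.
    """
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        self.W = rng.uniform(-s, s, (hidden_size, input_size))
        self.U_theta = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.V_theta = rng.uniform(-s, s, (hidden_size, input_size))
        self.b_theta = np.zeros(hidden_size)
        self.U_eta = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.V_eta = rng.uniform(-s, s, (hidden_size, input_size))
        self.b_eta = np.zeros(hidden_size)

    def step(self, h, x):
        theta = sigmoid(self.U_theta @ h + self.V_theta @ x + self.b_theta)
        eta = sigmoid(self.U_eta @ h + self.V_eta @ x + self.b_eta)
        return theta * np.tanh(h) + eta * np.tanh(self.W @ x)

# With zero input the update collapses to h <- theta * tanh(h), a contraction,
# so the hidden state relaxes monotonically to the zero fixed point: this is
# the "predictable, non-chaotic" dynamics the abstract refers to.
cell = CFNCell(input_size=4, hidden_size=8)
h = np.ones(8)
for _ in range(50):
    h = cell.step(h, np.zeros(4))
print(np.abs(h).max())  # ~0: activations decay instead of behaving chaotically
```

The demo mirrors the paper's central observation: once the input is switched off, each CFN unit decays smoothly toward the origin, whereas the free dynamics of LSTM/GRU cells can wander chaotically.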
Cite
Text
Laurent and von Brecht. "A Recurrent Neural Network Without Chaos." International Conference on Learning Representations, 2017.
Markdown
[Laurent and von Brecht. "A Recurrent Neural Network Without Chaos." International Conference on Learning Representations, 2017.](https://mlanthology.org/iclr/2017/laurent2017iclr-recurrent/)
BibTeX
@inproceedings{laurent2017iclr-recurrent,
  title = {{A Recurrent Neural Network Without Chaos}},
  author = {Laurent, Thomas and von Brecht, James},
  booktitle = {International Conference on Learning Representations},
  year = {2017},
  url = {https://mlanthology.org/iclr/2017/laurent2017iclr-recurrent/}
}