Learning Nonregular Languages: A Comparison of Simple Recurrent Networks and LSTM
Abstract
In response to Rodriguez's recent article (2001), we compare the performance of simple recurrent nets and long short-term memory recurrent nets on context-free and context-sensitive languages.
Cite
Schmidhuber, Jürgen, Felix A. Gers, and Douglas Eck. "Learning Nonregular Languages: A Comparison of Simple Recurrent Networks and LSTM." Neural Computation 14:2039-2041, 2002. doi:10.1162/089976602320263980
BibTeX
@article{schmidhuber2002neco-learning,
title = {{Learning Nonregular Languages: A Comparison of Simple Recurrent Networks and LSTM}},
author = {Schmidhuber, Jürgen and Gers, Felix A. and Eck, Douglas},
journal = {Neural Computation},
year = {2002},
pages = {2039--2041},
doi = {10.1162/089976602320263980},
volume = {14},
url = {https://mlanthology.org/neco/2002/schmidhuber2002neco-learning/}
}