Higher Order Recurrent Networks and Grammatical Inference
Abstract
A higher order single layer recurrent network easily learns to simulate a deterministic finite state machine and recognize regular grammars. When an enhanced version of this neural net state machine is connected through a common error term to an external analog stack memory, the combination can be interpreted as a neural net pushdown automaton. The neural net finite state machine is given the primitives push and pop, and is able to read the top of the stack. Through a gradient descent learning rule derived from the common error function, the hybrid network learns to use the stack actions effectively to manipulate the stack memory and to learn simple context-free grammars.
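
To make the described architecture concrete, below is a minimal Python sketch of its two components: a second-order (higher order) recurrent state update of the form s_i(t+1) = g(sum_jk W[i,j,k] s_j(t) x_k(t)), and an analog stack whose entries carry a continuous "thickness" so that push and pop can take fractional values. The class names, the sigmoid nonlinearity, and the exact stack semantics are illustrative assumptions, not the paper's verbatim equations; the gradient descent learning rule derived from the common error function is omitted.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class SecondOrderRNN:
    # State update: s_i(t+1) = sigmoid(sum_jk W[i,j,k] * s_j(t) * x_k(t)).
    # The weight tensor pairs (current state, input symbol) -> next state,
    # so a learned W can directly encode a DFA's transition table.
    def __init__(self, n_states, n_symbols, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.1, size=(n_states, n_states, n_symbols))
        self.s0 = np.zeros(n_states)
        self.s0[0] = 1.0  # designated start state

    def run(self, one_hot_string):
        s = self.s0
        for x in one_hot_string:  # x: one-hot input symbol of length n_symbols
            s = sigmoid(np.einsum('ijk,j,k->i', self.W, s, x))
        return s  # e.g., treat s[-1] > 0.5 as "string accepted"

class ContinuousStack:
    # Analog stack: each entry is [vector, thickness]. An action value
    # a in (0, 1] pushes or pops that much total thickness, so stack
    # operations are continuous-valued rather than all-or-none.
    def __init__(self):
        self.items = []

    def push(self, v, a):
        if a > 0.0:
            self.items.append([np.asarray(v, dtype=float), a])

    def pop(self, a):
        # Remove total thickness a from the top, consuming entries
        # fractionally where necessary.
        while a > 0.0 and self.items:
            v, t = self.items[-1]
            if t > a:
                self.items[-1][1] = t - a
                break
            self.items.pop()
            a -= t

    def read(self, dim):
        # Thickness-weighted blend of the top entries down to depth 1,
        # giving the network a continuous view of the top of the stack.
        out, depth = np.zeros(dim), 0.0
        for v, t in reversed(self.items):
            w = min(t, 1.0 - depth)
            out += w * v
            depth += w
            if depth >= 1.0:
                break
        return out

In the hybrid model the abstract describes, additional outputs of the state network would produce the push/pop action and the value to push, and the stack reading would be fed back as an input on the next step, so that both the state transitions and the stack actions are trained through a single common error function.
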
Cite
Text
Giles et al. "Higher Order Recurrent Networks and Grammatical Inference." Neural Information Processing Systems, 1989.
Markdown
[Giles et al. "Higher Order Recurrent Networks and Grammatical Inference." Neural Information Processing Systems, 1989.](https://mlanthology.org/neurips/1989/giles1989neurips-higher/)
BibTeX
@inproceedings{giles1989neurips-higher,
title = {{Higher Order Recurrent Networks and Grammatical Inference}},
author = {Giles, C. Lee and Sun, Guo-Zheng and Chen, Hsing-Hen and Lee, Yee-Chun and Chen, Dong},
booktitle = {Neural Information Processing Systems},
year = {1989},
pages = {380-387},
url = {https://mlanthology.org/neurips/1989/giles1989neurips-higher/}
}