Induction of Finite-State Automata Using Second-Order Recurrent Networks
Abstract
Second-order recurrent networks that recognize simple finite state languages over {0,1}* are induced from positive and negative examples. Using the complete gradient of the recurrent network and sufficient training examples to constrain the definition of the language to be induced, solutions are obtained that correctly recognize strings of arbitrary length. A method for extracting a finite state automaton corresponding to an optimized network is demonstrated.
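The "second-order" network in the abstract refers to a state update in which current state and current input symbol are multiplied together under a third-order weight tensor. A minimal sketch of one such update, assuming hypothetical dimensions and a sigmoid activation (the specific sizes, the acceptance threshold, and the one-hot input encoding are illustrative assumptions, not details taken from the paper):

```python
import numpy as np

# Illustrative dimensions: 3 state units, binary input alphabet {0, 1}.
N_STATES, N_SYMBOLS = 3, 2

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def step(W, s, x):
    """Second-order update: s'_j = g(sum_{i,k} W[j,i,k] * s_i * x_k).

    W has shape (N_STATES, N_STATES, N_SYMBOLS); s is the state vector,
    x a one-hot encoding of the current input symbol.
    """
    return sigmoid(np.einsum('jik,i,k->j', W, s, x))

def accepts(W, s0, string):
    """Run the network over a binary string; call unit 0 the accept unit
    and accept when its final activation exceeds 0.5 (an assumed rule)."""
    s = s0
    for ch in string:
        x = np.eye(N_SYMBOLS)[int(ch)]
        s = step(W, s, x)
    return bool(s[0] > 0.5)
```

Training would then adjust W by gradient descent on the acceptance error over the positive and negative example strings; the state trajectories of the trained network can afterwards be clustered to read off a finite state automaton.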
Cite

Text

Watrous and Kuhn. "Induction of Finite-State Automata Using Second-Order Recurrent Networks." Neural Information Processing Systems, 1991.

Markdown

[Watrous and Kuhn. "Induction of Finite-State Automata Using Second-Order Recurrent Networks." Neural Information Processing Systems, 1991.](https://mlanthology.org/neurips/1991/watrous1991neurips-induction/)

BibTeX
@inproceedings{watrous1991neurips-induction,
title = {{Induction of Finite-State Automata Using Second-Order Recurrent Networks}},
author = {Watrous, Raymond L. and Kuhn, Gary M.},
booktitle = {Neural Information Processing Systems},
year = {1991},
pages = {309-317},
url = {https://mlanthology.org/neurips/1991/watrous1991neurips-induction/}
}