Extracting and Learning an Unknown Grammar with Recurrent Neural Networks
Abstract
Simple second-order recurrent networks are shown to readily learn small known regular grammars when trained with positive and negative string examples. We show that similar methods are appropriate for learning unknown grammars from examples of their strings. The training algorithm is an incremental real-time, recurrent learning (RTRL) method that computes the complete gradient and updates the weights at the end of each string. After or during training, a dynamic clustering algorithm extracts the production rules that the neural network has learned. The methods are illustrated by extracting rules from unknown deterministic regular grammars. For many cases the extracted grammar outperforms the neural net from which it was extracted in correctly classifying unseen strings.
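A second-order recurrent network of the kind described in the abstract gates its state transitions by products of state activations and input symbols. The following is a minimal sketch of one state update and string classification, assuming the standard second-order formulation s_j(t+1) = g(Σ_{i,k} W_jik s_i(t) x_k(t)) with a logistic activation and a designated accept unit; the function names and the choice of accept unit are illustrative, not the paper's exact implementation.

```python
import numpy as np

def step(W, s, x):
    """One second-order recurrent update.

    W : (n_states, n_states, n_symbols) weight tensor
    s : (n_states,) current state activations
    x : (n_symbols,) one-hot encoding of the current input symbol
    """
    # Second-order term: each new state unit j sums products s_i * x_k.
    pre = np.einsum('jik,i,k->j', W, s, x)
    return 1.0 / (1.0 + np.exp(-pre))  # logistic activation

def classify(W, s0, symbols, n_symbols):
    """Run a string of symbol indices; accept if the first unit is active."""
    s = s0
    for sym in symbols:
        x = np.zeros(n_symbols)
        x[sym] = 1.0
        s = step(W, s, x)
    return s[0] > 0.5
```

Because the input one-hot vector selects a slice of the weight tensor, each symbol effectively applies its own state-transition matrix, which is what makes clustering the hidden states into a deterministic finite automaton natural after training.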
Cite
Text
Giles et al. "Extracting and Learning an Unknown Grammar with Recurrent Neural Networks." Neural Information Processing Systems, 1991.

Markdown
[Giles et al. "Extracting and Learning an Unknown Grammar with Recurrent Neural Networks." Neural Information Processing Systems, 1991.](https://mlanthology.org/neurips/1991/giles1991neurips-extracting/)

BibTeX
@inproceedings{giles1991neurips-extracting,
title = {{Extracting and Learning an Unknown Grammar with Recurrent Neural Networks}},
author = {Giles, C. L. and Miller, C. B. and Chen, D. and Sun, G. Z. and Chen, H. H. and Lee, Y. C.},
booktitle = {Neural Information Processing Systems},
year = {1991},
pages = {317-324},
url = {https://mlanthology.org/neurips/1991/giles1991neurips-extracting/}
}