Learning Context-Free Grammars from Structural Data in Polynomial Time

Abstract

We consider the problem of learning a context-free grammar from its structural descriptions. Structural descriptions of a context-free grammar are unlabelled derivation trees of the grammar. We present an efficient algorithm for learning context-free grammars using two types of queries: structural equivalence queries and structural membership queries. The learning protocol is based on what is called a “minimally adequate teacher”, and it is shown that a grammar learned by the algorithm is not only a correct grammar, i.e., equivalent to the unknown grammar, but also structurally equivalent to it. Furthermore, the algorithm runs in time polynomial in the number of states of the minimum frontier-to-root tree automaton for the set of structural descriptions of the unknown grammar and the maximum size of any counter-example returned by a structural equivalence query.
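The query protocol in the abstract can be sketched as follows. This is a minimal illustrative Python sketch, not the paper's algorithm: the function names, the tuple encoding of unlabelled derivation trees, and the toy target language are all assumptions, and a true structural equivalence query is approximated here by checking a finite pool of probe trees.

```python
# Hypothetical sketch of the "minimally adequate teacher" (MAT) protocol
# for structural data. A structural description (skeleton) is modeled as
# a nested tuple: leaves are terminal symbols, internal nodes are the
# anonymous tuples themselves.

def is_skeleton_of_target(tree):
    """Structural membership query: is `tree` an unlabelled derivation
    tree of the unknown grammar? Toy target (an assumption for this
    sketch): S -> a S b | a b, i.e. nested skeletons over {a, b}."""
    if tree == ("a", "b"):
        return True
    return (isinstance(tree, tuple) and len(tree) == 3
            and tree[0] == "a" and tree[2] == "b"
            and is_skeleton_of_target(tree[1]))

def structural_equivalence_query(hypothesis_membership, probes):
    """Structural equivalence query, approximated by comparing the
    hypothesis and the target on a finite pool of probe trees.
    Returns a counter-example tree, or None if they agree."""
    for t in probes:
        if hypothesis_membership(t) != is_skeleton_of_target(t):
            return t
    return None

probes = [
    ("a", "b"),
    ("a", ("a", "b"), "b"),
    ("b", "a"),
]
# A deliberately wrong first hypothesis that accepts only the smallest
# skeleton; the teacher answers with a counter-example the learner
# would then use to refine its hypothesis.
counterexample = structural_equivalence_query(lambda t: t == ("a", "b"), probes)
```

In the actual algorithm, each counter-example returned by the equivalence query drives a refinement of a frontier-to-root tree automaton hypothesis, with structural membership queries resolving the remaining ambiguities.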

Cite

Text

Sakakibara. "Learning Context-Free Grammars from Structural Data in Polynomial Time." Annual Conference on Computational Learning Theory, 1988. doi:10.1016/0304-3975(90)90017-C

Markdown

[Sakakibara. "Learning Context-Free Grammars from Structural Data in Polynomial Time." Annual Conference on Computational Learning Theory, 1988.](https://mlanthology.org/colt/1988/sakakibara1988colt-learning/) doi:10.1016/0304-3975(90)90017-C

BibTeX

@inproceedings{sakakibara1988colt-learning,
  title     = {{Learning Context-Free Grammars from Structural Data in Polynomial Time}},
  author    = {Sakakibara, Yasubumi},
  booktitle = {Annual Conference on Computational Learning Theory},
  year      = {1988},
  pages     = {330--344},
  doi       = {10.1016/0304-3975(90)90017-C},
  url       = {https://mlanthology.org/colt/1988/sakakibara1988colt-learning/}
}