Statistical Parsing with Probabilistic Symbol-Refined Tree Substitution Grammars
Abstract
We present probabilistic Symbol-Refined Tree Substitution Grammars (SR-TSG) for statistical parsing of natural language sentences. An SR-TSG is an extension of the conventional TSG model where each nonterminal symbol can be refined (subcategorized) to fit the training data. Our probabilistic model is consistently based on the hierarchical Pitman-Yor Process to encode backoff smoothing from a fine-grained SR-TSG to simpler CFG rules, so all grammar rules can be learned from training data in a fully automatic fashion. Our SR-TSG parser achieves state-of-the-art performance on the Wall Street Journal (WSJ) English Penn Treebank data.
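The backoff smoothing described above can be illustrated with the Chinese-restaurant-process view of a Pitman-Yor prior, where predictive probabilities for fine-grained events interpolate toward a simpler base distribution. The sketch below is a hypothetical minimal illustration (class and method names are our own, not the paper's implementation), assuming a single restaurant whose base distribution plays the role of the coarser CFG backoff:

```python
import random

class PitmanYorRestaurant:
    """Minimal Chinese-restaurant-process view of a Pitman-Yor prior.

    Hypothetical sketch, not the paper's implementation: predictive
    probabilities for dishes (e.g. fine-grained SR-TSG rules) back off
    to a simpler base distribution (e.g. CFG rules).
    """

    def __init__(self, discount, concentration, base):
        self.d = discount           # discount parameter, 0 <= d < 1
        self.theta = concentration  # concentration parameter, theta > -d
        self.base = base            # backoff distribution: dish -> probability
        self.tables = []            # list of (dish, customer_count) per table
        self.n = 0                  # total customers seated so far

    def prob(self, dish):
        """Predictive probability of `dish`, interpolating with the base."""
        if self.n == 0:
            return self.base(dish)
        seated = sum(c - self.d for (k, c) in self.tables if k == dish)
        new_table = (self.theta + self.d * len(self.tables)) * self.base(dish)
        return (seated + new_table) / (self.theta + self.n)

    def seat(self, dish):
        """Seat one customer: join an existing table for `dish` or open a new one."""
        weights = [c - self.d if k == dish else 0.0 for (k, c) in self.tables]
        weights.append((self.theta + self.d * len(self.tables)) * self.base(dish))
        r = random.random() * sum(weights)
        for i, w in enumerate(weights):
            r -= w
            if r <= 0:
                break
        if i == len(self.tables):
            self.tables.append((dish, 1))
        else:
            k, c = self.tables[i]
            self.tables[i] = (k, c + 1)
        self.n += 1
```

With no observations, `prob` returns the base distribution exactly; as counts accumulate, the prediction shifts toward the observed relative frequencies while the discount reserves mass for unseen events, which is the smoothing behavior the hierarchical model exploits.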
Cite
Text
Shindo et al. "Statistical Parsing with Probabilistic Symbol-Refined Tree Substitution Grammars." International Joint Conference on Artificial Intelligence, 2013.
Markdown
[Shindo et al. "Statistical Parsing with Probabilistic Symbol-Refined Tree Substitution Grammars." International Joint Conference on Artificial Intelligence, 2013.](https://mlanthology.org/ijcai/2013/shindo2013ijcai-statistical/)
BibTeX
@inproceedings{shindo2013ijcai-statistical,
title = {{Statistical Parsing with Probabilistic Symbol-Refined Tree Substitution Grammars}},
author = {Shindo, Hiroyuki and Miyao, Yusuke and Fujino, Akinori and Nagata, Masaaki},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2013},
pages = {3082--3086},
url = {https://mlanthology.org/ijcai/2013/shindo2013ijcai-statistical/}
}