Logistic Normal Priors for Unsupervised Probabilistic Grammar Induction

Abstract

We explore a new Bayesian model for probabilistic grammars, a family of distributions over discrete structures that includes hidden Markov models and probabilistic context-free grammars. Our model extends the correlated topic model framework to probabilistic grammars, exploiting the logistic normal distribution as a prior over the grammar parameters. We derive a variational EM algorithm for this model, then experiment with the task of unsupervised grammar induction for natural language dependency parsing. We show that our model outperforms previous models that use different priors.
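The core modeling idea is that a logistic normal draw is the softmax of a multivariate Gaussian sample, so the Gaussian covariance can encode correlations among the multinomial parameters of a grammar rule (something a Dirichlet prior cannot do). The sketch below illustrates this; it is a minimal illustration of the distribution itself, not the authors' code, and the example mean and covariance are invented for demonstration.

```python
import numpy as np

def sample_logistic_normal(mu, Sigma, rng=None):
    """Draw a probability vector from a logistic normal distribution:
    sample eta ~ N(mu, Sigma), then apply the softmax to eta."""
    rng = np.random.default_rng() if rng is None else rng
    eta = rng.multivariate_normal(mu, Sigma)
    # Numerically stable softmax maps the Gaussian sample to the simplex.
    exp_eta = np.exp(eta - eta.max())
    return exp_eta / exp_eta.sum()

# Hypothetical correlated prior over a 3-event multinomial
# (e.g., the distribution over one nonterminal's rule choices).
# The off-diagonal 0.8 makes the first two probabilities co-vary.
mu = np.zeros(3)
Sigma = np.array([[1.0, 0.8, 0.0],
                  [0.8, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])
theta = sample_logistic_normal(mu, Sigma)  # a point on the simplex
```

Under a Dirichlet prior the components of `theta` can only be (weakly) negatively coupled; here `Sigma` lets related grammar events rise and fall together, which is what the paper exploits.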

Cite

Text

Cohen et al. "Logistic Normal Priors for Unsupervised Probabilistic Grammar Induction." Neural Information Processing Systems, 2008.

Markdown

[Cohen et al. "Logistic Normal Priors for Unsupervised Probabilistic Grammar Induction." Neural Information Processing Systems, 2008.](https://mlanthology.org/neurips/2008/cohen2008neurips-logistic/)

BibTeX

@inproceedings{cohen2008neurips-logistic,
  title     = {{Logistic Normal Priors for Unsupervised Probabilistic Grammar Induction}},
  author    = {Cohen, Shay B. and Gimpel, Kevin and Smith, Noah A.},
  booktitle = {Neural Information Processing Systems},
  year      = {2008},
  pages     = {321--328},
  url       = {https://mlanthology.org/neurips/2008/cohen2008neurips-logistic/}
}