Integrating Topics and Syntax
Abstract
Statistical approaches to language learning typically focus on either short-range syntactic dependencies or long-range semantic dependencies between words. We present a generative model that uses both kinds of dependencies, and can be used to simultaneously find syntactic classes and semantic topics despite having no representation of syntax or semantics beyond statistical dependency. This model is competitive on tasks like part-of-speech tagging and document classification with models that exclusively use short- and long-range dependencies respectively.
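The model the abstract describes is the composite HMM+topics model often referred to as HMM-LDA: a hidden Markov model chooses a syntactic class for each word, and one designated class emits words from a document-specific mixture of topics, so short-range class transitions and long-range topic assignments jointly explain each token. The following is a minimal sketch of that generative process; the vocabulary size, hyperparameters, and the `SEMANTIC` name for the topic-emitting class are illustrative assumptions, not values from the paper.

```python
# Sketch of the composite generative process (HMM over syntactic
# classes, with one class emitting from document-specific topics).
# All sizes and Dirichlet hyperparameters below are illustrative.
import numpy as np

rng = np.random.default_rng(0)

V, T, C = 1000, 20, 10          # vocabulary, topics, syntactic classes
alpha, beta, gamma = 0.1, 0.01, 0.1

topics = rng.dirichlet(beta * np.ones(V), size=T)    # topic-word distributions
classes = rng.dirichlet(beta * np.ones(V), size=C)   # class-word distributions
trans = rng.dirichlet(gamma * np.ones(C), size=C)    # HMM transition matrix
SEMANTIC = 0                                         # the topic-emitting class

def generate_document(n_words):
    theta = rng.dirichlet(alpha * np.ones(T))  # per-document topic mixture
    c, words = 0, []                           # start state fixed for brevity
    for _ in range(n_words):
        c = rng.choice(C, p=trans[c])          # short-range: class transition
        if c == SEMANTIC:
            z = rng.choice(T, p=theta)         # long-range: document topic
            words.append(rng.choice(V, p=topics[z]))
        else:
            words.append(rng.choice(V, p=classes[c]))
    return words

doc = generate_document(50)
```

The sketch covers only the forward generative story; in the paper, the posterior over class and topic assignments is inferred with Markov chain Monte Carlo.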
Cite
Text
Griffiths et al. "Integrating Topics and Syntax." Neural Information Processing Systems, 2004.

Markdown

[Griffiths et al. "Integrating Topics and Syntax." Neural Information Processing Systems, 2004.](https://mlanthology.org/neurips/2004/griffiths2004neurips-integrating/)

BibTeX
@inproceedings{griffiths2004neurips-integrating,
title = {{Integrating Topics and Syntax}},
author = {Griffiths, Thomas L. and Steyvers, Mark and Blei, David M. and Tenenbaum, Joshua B.},
booktitle = {Neural Information Processing Systems},
year = {2004},
pages = {537--544},
url = {https://mlanthology.org/neurips/2004/griffiths2004neurips-integrating/}
}