Incorporating Domain Knowledge into Topic Modeling via Dirichlet Forest Priors
Abstract
Users of topic modeling methods often have knowledge about the composition of words that should have high or low probability in various topics. We incorporate such domain knowledge using a novel Dirichlet forest prior in a Latent Dirichlet Allocation framework. The prior is a mixture of Dirichlet tree distributions with special structures. We present its construction, and inference via collapsed Gibbs sampling. Experiments on synthetic and real datasets demonstrate our model's ability to follow and generalize beyond user-specified domain knowledge.
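The paper's prior is built from Dirichlet tree distributions, in which each internal node holds a Dirichlet over its children and a word's probability is the product of branch probabilities along its root-to-leaf path. As a hedged illustration (not the authors' code), the sketch below draws one topic-word distribution from a toy Dirichlet tree; the tree structure, edge weights, and vocabulary are illustrative assumptions, with a Must-Link-style subtree that places large weights under a shared node to encourage similar probabilities for its words.

```python
# Illustrative sketch only: sampling a topic-word distribution from a
# simple Dirichlet tree. Structure and weights are assumptions, not the
# paper's actual prior construction.
import random

def sample_dirichlet(alphas):
    # Sample from Dirichlet(alphas) via normalized Gamma draws.
    draws = [random.gammavariate(a, 1.0) for a in alphas]
    total = sum(draws)
    return [d / total for d in draws]

def sample_dirichlet_tree(node):
    # node is ("leaf", word) or ("internal", [(edge_weight, child), ...]).
    # Returns {word: probability}; a leaf's probability is the product of
    # branch probabilities on its root-to-leaf path.
    kind, payload = node
    if kind == "leaf":
        return {payload: 1.0}
    branch_probs = sample_dirichlet([w for w, _ in payload])
    dist = {}
    for p, (_, child) in zip(branch_probs, payload):
        for word, q in sample_dirichlet_tree(child).items():
            dist[word] = p * q
    return dist

# Toy example: "dna" and "rna" share an internal node with large edge
# weights, so their probabilities tend to be close (a Must-Link effect).
tree = ("internal", [
    (1.0, ("internal", [(50.0, ("leaf", "dna")), (50.0, ("leaf", "rna"))])),
    (1.0, ("leaf", "protein")),
])
phi = sample_dirichlet_tree(tree)
```

Because the branch probabilities at each node sum to one, the leaf probabilities always form a valid distribution over the vocabulary.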
Cite
Text
Andrzejewski et al. "Incorporating Domain Knowledge into Topic Modeling via Dirichlet Forest Priors." International Conference on Machine Learning, 2009. doi:10.1145/1553374.1553378
Markdown
[Andrzejewski et al. "Incorporating Domain Knowledge into Topic Modeling via Dirichlet Forest Priors." International Conference on Machine Learning, 2009.](https://mlanthology.org/icml/2009/andrzejewski2009icml-incorporating/) doi:10.1145/1553374.1553378
BibTeX
@inproceedings{andrzejewski2009icml-incorporating,
title = {{Incorporating Domain Knowledge into Topic Modeling via Dirichlet Forest Priors}},
author = {Andrzejewski, David and Zhu, Xiaojin and Craven, Mark},
booktitle = {International Conference on Machine Learning},
year = {2009},
pages = {25-32},
doi = {10.1145/1553374.1553378},
url = {https://mlanthology.org/icml/2009/andrzejewski2009icml-incorporating/}
}