Combinatorial Topic Models Using Small-Variance Asymptotics
Abstract
Topic models have emerged as fundamental tools in unsupervised machine learning. Most modern topic modeling algorithms take a probabilistic view and derive inference algorithms based on Latent Dirichlet Allocation (LDA) or its variants. In contrast, we study topic modeling as a combinatorial optimization problem, and propose a new objective function derived from LDA by passing to the small-variance limit. We minimize the derived objective using ideas from combinatorial optimization, which yields a new, fast, high-quality topic modeling algorithm. In particular, we show that our results are competitive with popular LDA-based topic modeling approaches, and also discuss the (dis)similarities between our approach and its probabilistic counterparts.
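To make the small-variance limit concrete, the sketch below works through the classic instance of the recipe: a Gaussian mixture whose MAP objective collapses to the k-means objective as the variance vanishes. This is an illustrative analogy only, not the paper's derivation, which applies the same limiting argument to LDA's Dirichlet-multinomial model.

% Minimal compilable sketch of small-variance asymptotics (SVA) on a
% Gaussian mixture; the GMM-to-k-means collapse shown here is the
% standard SVA example, not the paper's LDA-specific objective.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
For a $K$-component Gaussian mixture with mixing weights $\pi_k$ and
shared covariance $\sigma^2 I$, the complete-data negative
log-likelihood under assignments $z_i$ and means $\mu_k$ is
\[
  -\log p(x, z \mid \mu, \pi)
  = \sum_{i=1}^{n} \frac{\lVert x_i - \mu_{z_i} \rVert^2}{2\sigma^2}
    + \frac{nd}{2}\log\bigl(2\pi\sigma^2\bigr)
    - \sum_{i=1}^{n} \log \pi_{z_i}.
\]
Scaling by $2\sigma^2$ and letting $\sigma^2 \to 0$ eliminates every
term except the quadratic one, leaving the combinatorial $k$-means
objective
\[
  \min_{z,\,\mu} \; \sum_{i=1}^{n} \lVert x_i - \mu_{z_i} \rVert^2,
\]
so probabilistic inference degenerates into hard alternating
minimization over assignments and means.
\end{document}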
Cite
Jiang et al. "Combinatorial Topic Models Using Small-Variance Asymptotics." International Conference on Artificial Intelligence and Statistics, 2017.
BibTeX
@inproceedings{jiang2017aistats-combinatorial,
title = {{Combinatorial Topic Models Using Small-Variance Asymptotics}},
author = {Jiang, Ke and Sra, Suvrit and Kulis, Brian},
booktitle = {International Conference on Artificial Intelligence and Statistics},
year = {2017},
pages = {421--429},
url = {https://mlanthology.org/aistats/2017/jiang2017aistats-combinatorial/}
}