Sparse Stochastic Inference for Latent Dirichlet Allocation
Abstract
We present a hybrid algorithm for Bayesian topic models that combines the efficiency of sparse Gibbs sampling with the scalability of online stochastic inference. We used our algorithm to analyze a corpus of 1.2 million books (33 billion words) with thousands of topics. Our approach reduces the bias of variational inference and generalizes to many Bayesian hidden-variable models.
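As a rough illustration of the hybrid scheme the abstract describes, the sketch below pairs stochastic variational updates of the global topic-word parameters with Gibbs-sampled topic assignments in the local (per-document) step. This is not the authors' implementation: the hyperparameters (eta, alpha, tau0, kappa), the corpus size D, the toy random documents, and all variable names are assumptions chosen for illustration only.

# Minimal sketch (assumed setup, not the paper's reference code): stochastic
# variational inference for LDA where each document's topic assignments are
# Gibbs-sampled instead of being given a fully factorized variational update.
import numpy as np
from scipy.special import digamma

rng = np.random.default_rng(0)

K, V = 10, 1000          # topics, vocabulary size (toy values)
eta, alpha = 0.01, 0.1   # Dirichlet hyperparameters (assumed)
tau0, kappa = 1.0, 0.7   # learning-rate schedule rho_t = (t + tau0)^(-kappa)
D = 10_000               # assumed total number of documents in the corpus

lambda_ = rng.gamma(100.0, 0.01, size=(K, V))  # global variational parameter

def sample_local_counts(doc, exp_Elog_beta, n_sweeps=20, burn_in=10):
    # Gibbs-sample topic assignments for one document (a sequence of word ids)
    # and return topic-word counts averaged over the post-burn-in sweeps.
    words = np.asarray(doc)
    z = rng.integers(K, size=len(words))           # random initial assignments
    n_k = np.bincount(z, minlength=K).astype(float)
    counts = np.zeros((K, V))
    kept = 0
    for sweep in range(n_sweeps):
        for i, w in enumerate(words):
            n_k[z[i]] -= 1.0
            p = (alpha + n_k) * exp_Elog_beta[:, w]
            z[i] = rng.choice(K, p=p / p.sum())
            n_k[z[i]] += 1.0
        if sweep >= burn_in:                        # accumulate kept samples
            np.add.at(counts, (z, words), 1.0)
            kept += 1
    return counts / kept

batch_size = 4
for t in range(100):
    rho = (t + tau0) ** (-kappa)                    # step size for global update
    exp_Elog_beta = np.exp(digamma(lambda_) -
                           digamma(lambda_.sum(axis=1, keepdims=True)))
    # Toy mini-batch: documents of 50 random word ids each.
    batch = [rng.integers(V, size=50) for _ in range(batch_size)]
    stats = sum(sample_local_counts(doc, exp_Elog_beta) for doc in batch)
    # Noisy natural-gradient step on the global topic-word parameters.
    lambda_ = (1.0 - rho) * lambda_ + rho * (eta + (D / batch_size) * stats)

Because the sampled assignments for a document touch only the words it contains, the sufficient statistics in each update are sparse, which is the source of the efficiency the abstract refers to; the loop over mini-batches is what makes the procedure scale to very large corpora.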
Cite
Text
Mimno et al. "Sparse Stochastic Inference for Latent Dirichlet Allocation." International Conference on Machine Learning, 2012.

Markdown

[Mimno et al. "Sparse Stochastic Inference for Latent Dirichlet Allocation." International Conference on Machine Learning, 2012.](https://mlanthology.org/icml/2012/mimno2012icml-sparse/)

BibTeX
@inproceedings{mimno2012icml-sparse,
  title = {{Sparse Stochastic Inference for Latent Dirichlet Allocation}},
  author = {Mimno, David M. and Hoffman, Matthew D. and Blei, David M.},
  booktitle = {International Conference on Machine Learning},
  year = {2012},
  url = {https://mlanthology.org/icml/2012/mimno2012icml-sparse/}
}