Scalable MCMC for Mixed Membership Stochastic Blockmodels
Abstract
We propose a stochastic gradient Markov chain Monte Carlo (SG-MCMC) algorithm for scalable inference in mixed-membership stochastic blockmodels (MMSB). Our algorithm is based on the stochastic gradient Riemannian Langevin sampler and achieves both faster speed and higher accuracy at every iteration than the current state-of-the-art algorithm based on stochastic variational inference. In addition, we develop an approximation that can handle models with a very large number of communities. Our experimental results show that SG-MCMC strictly dominates competing algorithms in all cases.
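The sampler named above builds on stochastic gradient Riemannian Langevin dynamics (SGRLD) with the expanded-mean parameterization of the probability simplex (Patterson and Teh, 2013). As a rough illustration only, here is a minimal Python sketch of one such update step; the function name `sgrld_step` and the generic minibatch gradient estimate `g_hat` are assumptions for illustration, not the paper's exact MMSB estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

def sgrld_step(theta, g_hat, alpha, eps):
    """One SGRLD update in the expanded-mean parameterization.

    theta: nonnegative parameters; pi = theta / theta.sum() lies on the simplex.
    g_hat: preconditioned minibatch estimate of the log-likelihood gradient
           (hypothetical placeholder; the paper derives its own MMSB estimator).
    alpha: Dirichlet prior concentration.
    eps:   step size (annealed over iterations in practice).
    """
    noise = rng.normal(size=theta.shape)           # injected Gaussian noise
    return np.abs(                                 # mirroring keeps theta >= 0
        theta
        + 0.5 * eps * (alpha - theta + g_hat)      # drift: prior term + stochastic gradient
        + np.sqrt(eps * theta) * noise             # Riemannian (theta-scaled) diffusion
    )
```

The absolute value implements the mirroring trick that keeps the parameters nonnegative, and scaling the noise by theta is what makes the update Riemannian rather than plain Langevin.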
Cite
Text
Li et al. "Scalable MCMC for Mixed Membership Stochastic Blockmodels." International Conference on Artificial Intelligence and Statistics, 2016.
Markdown
[Li et al. "Scalable MCMC for Mixed Membership Stochastic Blockmodels." International Conference on Artificial Intelligence and Statistics, 2016.](https://mlanthology.org/aistats/2016/li2016aistats-scalable/)
BibTeX
@inproceedings{li2016aistats-scalable,
title = {{Scalable MCMC for Mixed Membership Stochastic Blockmodels}},
author = {Li, Wenzhe and Ahn, Sungjin and Welling, Max},
booktitle = {International Conference on Artificial Intelligence and Statistics},
year = {2016},
pages = {723--731},
url = {https://mlanthology.org/aistats/2016/li2016aistats-scalable/}
}