Herded Gibbs Sampling
Abstract
The Gibbs sampler is one of the most popular algorithms for inference in statistical models. In this paper, we introduce a herding variant of this algorithm, called herded Gibbs, that is entirely deterministic. We prove that herded Gibbs has an $O(1/T)$ convergence rate for models with independent variables and for fully connected probabilistic graphical models. Herded Gibbs is shown to outperform Gibbs in the tasks of image denoising with MRFs and named entity recognition with CRFs. However, the convergence of herded Gibbs for sparsely connected probabilistic graphical models is still an open problem.
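The deterministic dynamics described above can be sketched on a toy example. The following is a minimal, hypothetical illustration (not the paper's implementation) of herded Gibbs on two binary variables, which form a fully connected model: each variable keeps one herding weight per conditioning state, the weight is incremented by the conditional probability at every visit, and the variable is set to 1 exactly when the weight crosses 1, after which the weight is decremented. The joint table and all names below are made up for illustration.

```python
# Illustrative joint distribution p(x1, x2) over {0,1}^2 (rows: x1, cols: x2).
joint = [[0.1, 0.3],
         [0.4, 0.2]]

def conditional(i, other_val):
    """p(x_i = 1 | x_other = other_val), derived from the joint table."""
    if i == 0:
        num = joint[1][other_val]
        den = joint[0][other_val] + joint[1][other_val]
    else:
        num = joint[other_val][1]
        den = joint[other_val][0] + joint[other_val][1]
    return num / den

# One herding weight per (variable, value of the conditioning variable).
weights = {(i, v): 0.0 for i in (0, 1) for v in (0, 1)}
x = [0, 0]                  # current joint state
counts = [[0, 0], [0, 0]]   # empirical joint counts over sweeps

T = 50_000                  # number of full deterministic sweeps
for _ in range(T):
    for i in (0, 1):
        other = x[1 - i]
        # Herding update: accumulate the conditional, then threshold.
        w = weights[(i, other)] + conditional(i, other)
        if w >= 1.0:
            x[i], w = 1, w - 1.0
        else:
            x[i] = 0
        weights[(i, other)] = w
    counts[x[0]][x[1]] += 1

# Empirical joint distribution after T sweeps.
est = [[c / T for c in row] for row in counts]
```

No random numbers are drawn anywhere: the weight acts as an accumulator whose long-run firing frequency matches the conditional probability, so the empirical distribution of the visited states approaches the target joint.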
Cite
Text
Chen et al. "Herded Gibbs Sampling." Journal of Machine Learning Research, 2016.
Markdown
[Chen et al. "Herded Gibbs Sampling." Journal of Machine Learning Research, 2016.](https://mlanthology.org/jmlr/2016/chen2016jmlr-herded/)
BibTeX
@article{chen2016jmlr-herded,
title = {{Herded Gibbs Sampling}},
author = {Chen, Yutian and Bornn, Luke and de Freitas, Nando and Eskelin, Mareija and Fang, Jing and Welling, Max},
journal = {Journal of Machine Learning Research},
year = {2016},
pages = {1--29},
volume = {17},
url = {https://mlanthology.org/jmlr/2016/chen2016jmlr-herded/}
}