Pseudo-Extended Markov Chain Monte Carlo

Abstract

Sampling from posterior distributions using Markov chain Monte Carlo (MCMC) methods can require a prohibitively large number of iterations, particularly when the posterior is multi-modal, as the MCMC sampler can become trapped in a local mode for many iterations. In this paper, we introduce the pseudo-extended MCMC method as a simple approach for improving the mixing of the MCMC sampler on multi-modal posterior distributions. The pseudo-extended method augments the state-space of the posterior using pseudo-samples as auxiliary variables. On the extended space, the modes of the posterior are connected, which allows the MCMC sampler to move easily between well-separated posterior modes. We demonstrate that the pseudo-extended approach delivers improved MCMC sampling over the Hamiltonian Monte Carlo algorithm on multi-modal posteriors, including Boltzmann machines and models with sparsity-inducing priors.
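The abstract's core idea can be illustrated with a small sketch. The target density, instrumental density `q`, number of pseudo-samples `N`, and random-walk kernel below are all illustrative choices based only on the abstract, not the authors' exact construction: we target an extended density on `N` pseudo-samples whose modes overlap wherever `q` has mass, then recover draws from the original target by selecting one pseudo-sample with probability proportional to its importance weight.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Bimodal target: equal mixture of N(-4, 1) and N(4, 1),
    # whose well-separated modes trap an ordinary sampler.
    return np.logaddexp(-0.5 * (x + 4.0) ** 2, -0.5 * (x - 4.0) ** 2)

def log_q(x):
    # Broad instrumental density q = N(0, 5^2) covering both modes
    # (an assumption of this sketch, not specified in the abstract).
    return -0.5 * (x / 5.0) ** 2 - np.log(5.0)

def log_extended(xs):
    # Extended target on pseudo-samples (x_1, ..., x_N):
    #   pi_N(x_{1:N}) ∝ prod_i q(x_i) * mean_j pi(x_j) / q(x_j).
    # Because q connects the modes, the extended space has no barrier
    # between them. Log-sum-exp is used for numerical stability.
    lw = log_target(xs) - log_q(xs)
    m = np.max(lw)
    return np.sum(log_q(xs)) + m + np.log(np.mean(np.exp(lw - m)))

def sample(n_iters=20000, N=2, step=2.5):
    # Plain random-walk Metropolis on the extended space (the paper
    # reports results with Hamiltonian Monte Carlo; this is simpler).
    xs = rng.normal(0.0, 5.0, size=N)
    lp = log_extended(xs)
    draws = []
    for _ in range(n_iters):
        prop = xs + step * rng.normal(size=N)
        lp_prop = log_extended(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            xs, lp = prop, lp_prop
        # Recover a draw from the original target by picking one
        # pseudo-sample with probability proportional to pi(x_j)/q(x_j).
        lw = log_target(xs) - log_q(xs)
        w = np.exp(lw - np.max(lw))
        draws.append(rng.choice(xs, p=w / w.sum()))
    return np.array(draws)

draws = sample()
# Both modes should be visited in roughly equal proportion.
print("fraction in left mode:", (draws < 0).mean())
```

With a single-chain random-walk sampler on the original bimodal density, the chain would typically stay in one mode for the entire run; on the extended space the selected draws alternate between both modes.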

Cite

Text

Nemeth et al. "Pseudo-Extended Markov Chain Monte Carlo." Neural Information Processing Systems, 2019.

Markdown

[Nemeth et al. "Pseudo-Extended Markov Chain Monte Carlo." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/nemeth2019neurips-pseudoextended/)

BibTeX

@inproceedings{nemeth2019neurips-pseudoextended,
  title     = {{Pseudo-Extended Markov Chain Monte Carlo}},
  author    = {Nemeth, Christopher and Lindsten, Fredrik and Filippone, Maurizio and Hensman, James},
  booktitle = {Neural Information Processing Systems},
  year      = {2019},
  pages     = {4312--4322},
  url       = {https://mlanthology.org/neurips/2019/nemeth2019neurips-pseudoextended/}
}