Continuously Tempered Hamiltonian Monte Carlo

Abstract

Hamiltonian Monte Carlo (HMC) is a powerful Markov chain Monte Carlo (MCMC) method for performing approximate inference in complex probabilistic models of continuous variables. In common with many MCMC methods, however, the standard HMC approach performs poorly in distributions with multiple isolated modes. We present a method for augmenting the Hamiltonian system with an extra continuous temperature control variable which allows the dynamic to bridge between sampling a complex target distribution and a simpler unimodal base distribution. This augmentation both helps improve mixing in multimodal targets and allows the normalisation constant of the target distribution to be estimated. The method is simple to implement within existing HMC code, requiring only a standard leapfrog integrator. We demonstrate experimentally that the method is competitive with annealed importance sampling and simulated tempering methods at sampling from challenging multimodal distributions and estimating their normalising constants.
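The sketch below is not the authors' implementation; it is a minimal illustration of the idea described in the abstract, assuming a sigmoidal inverse-temperature map, a one-dimensional bimodal Gaussian-mixture target, a broad Gaussian base, a weak prior on the control variable, and hand-picked step size and trajectory length. It runs ordinary leapfrog-based HMC on a state augmented with a continuous temperature control variable u, so the dynamic can move between the base and the target; the final thresholding step is a crude stand-in for the correction derived in the paper.

import numpy as np

rng = np.random.default_rng(0)

def target_nld_and_grad(x):
    # Illustrative bimodal target: equal mixture of N(-3, 1) and N(+3, 1).
    a = np.exp(-0.5 * (x + 3.0) ** 2)
    b = np.exp(-0.5 * (x - 3.0) ** 2)
    nld = -np.log(0.5 * a + 0.5 * b)
    grad = (a * (x + 3.0) + b * (x - 3.0)) / (a + b)
    return nld, grad

def base_nld_and_grad(x):
    # Simple unimodal base: N(0, 5^2), up to an additive constant.
    return 0.5 * (x / 5.0) ** 2, x / 25.0

def inv_temp(u):
    # Smooth map from the unconstrained control variable u to an
    # inverse temperature beta(u) in (0, 1).
    return 1.0 / (1.0 + np.exp(-u))

def joint_nld_and_grads(x, u):
    # Tempered negative log density beta(u)*target + (1 - beta(u))*base,
    # plus a weak Gaussian prior on u to keep the control variable bounded.
    b = inv_temp(u)
    t_nld, t_grad = target_nld_and_grad(x)
    s_nld, s_grad = base_nld_and_grad(x)
    nld = b * t_nld + (1.0 - b) * s_nld + 0.5 * (u / 3.0) ** 2
    grad_x = b * t_grad + (1.0 - b) * s_grad
    grad_u = b * (1.0 - b) * (t_nld - s_nld) + u / 9.0
    return nld, grad_x, grad_u

def hmc_step(x, u, step_size=0.2, n_steps=25):
    # One HMC transition on the augmented (x, u) state: standard leapfrog
    # integration followed by a Metropolis accept/reject correction.
    p_x, p_u = rng.standard_normal(2)
    nld, grad_x, grad_u = joint_nld_and_grads(x, u)
    h_init = nld + 0.5 * (p_x ** 2 + p_u ** 2)
    x_new, u_new = x, u
    for _ in range(n_steps):
        p_x -= 0.5 * step_size * grad_x
        p_u -= 0.5 * step_size * grad_u
        x_new += step_size * p_x
        u_new += step_size * p_u
        nld, grad_x, grad_u = joint_nld_and_grads(x_new, u_new)
        p_x -= 0.5 * step_size * grad_x
        p_u -= 0.5 * step_size * grad_u
    h_final = nld + 0.5 * (p_x ** 2 + p_u ** 2)
    if np.log(rng.uniform()) < h_init - h_final:
        return x_new, u_new
    return x, u

x, u = 0.0, 0.0
target_draws = []
for _ in range(5000):
    x, u = hmc_step(x, u)
    if inv_temp(u) > 0.99:
        # Crude thresholding for illustration only; the paper derives a proper
        # way to recover target expectations and the normalising constant.
        target_draws.append(x)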

Cite

Text

Graham and Storkey. "Continuously Tempered Hamiltonian Monte Carlo." Conference on Uncertainty in Artificial Intelligence, 2017.

Markdown

[Graham and Storkey. "Continuously Tempered Hamiltonian Monte Carlo." Conference on Uncertainty in Artificial Intelligence, 2017.](https://mlanthology.org/uai/2017/graham2017uai-continuously/)

BibTeX

@inproceedings{graham2017uai-continuously,
  title     = {{Continuously Tempered Hamiltonian Monte Carlo}},
  author    = {Graham, Matthew M. and Storkey, Amos J.},
  booktitle = {Conference on Uncertainty in Artificial Intelligence},
  year      = {2017},
  url       = {https://mlanthology.org/uai/2017/graham2017uai-continuously/}
}