Local-Global MCMC Kernels: The Best of Both Worlds

Abstract

Recent works leveraging learning to enhance sampling have shown promising results, in particular by designing effective non-local moves and global proposals. However, learning accuracy is inevitably limited in regions where little data is available, such as in the tails of distributions, as well as in high-dimensional problems. In the present paper we study an Explore-Exploit Markov chain Monte Carlo strategy ($\operatorname{Ex^2MCMC}$) that combines local and global samplers, showing that it enjoys the advantages of both approaches. We prove $V$-uniform geometric ergodicity of $\operatorname{Ex^2MCMC}$ without requiring a uniform adaptation of the global sampler to the target distribution. We also compute explicit bounds on the mixing rate of the Explore-Exploit strategy under realistic conditions. Moreover, we propose an adaptive version of the strategy ($\operatorname{FlEx^2MCMC}$) where a normalizing flow is trained while sampling to serve as a proposal for global moves. We illustrate the efficiency of $\operatorname{Ex^2MCMC}$ and its adaptive version on classical sampling benchmarks as well as in sampling high-dimensional distributions defined by Generative Adversarial Networks seen as Energy-Based Models.
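
To make the local-global idea concrete, here is a minimal sketch (not the authors' exact kernel) of an Explore-Exploit alternation: a global "explore" step via an independence Metropolis-Hastings move from a broad fixed proposal (standing in for a learned normalizing flow), followed by local "exploit" steps of MALA. The toy target, the Gaussian proposal scale, and the step counts are illustrative assumptions, not values from the paper.

```python
# Sketch only: global independence-MH move + local MALA rejuvenation on a toy target.
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Toy target: equal-weight mixture of two 2D standard Gaussians.
    c1, c2 = np.array([-2.0, 0.0]), np.array([2.0, 0.0])
    return np.logaddexp(-0.5 * np.sum((x - c1) ** 2),
                        -0.5 * np.sum((x - c2) ** 2))

def grad_log_target(x, eps=1e-5):
    # Finite-difference gradient keeps the sketch dependency-free.
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x); e[i] = eps
        g[i] = (log_target(x + e) - log_target(x - e)) / (2 * eps)
    return g

def global_move(x, scale=3.0):
    # "Explore": independence MH step, proposal N(0, scale^2 I) independent of x.
    y = scale * rng.standard_normal(x.shape)
    log_q = lambda z: -0.5 * np.sum(z ** 2) / scale ** 2
    log_alpha = log_target(y) - log_target(x) + log_q(x) - log_q(y)
    return y if np.log(rng.uniform()) < log_alpha else x

def local_move(x, step=0.1):
    # "Exploit": one MALA step (Langevin proposal plus MH correction).
    mu_x = x + 0.5 * step * grad_log_target(x)
    y = mu_x + np.sqrt(step) * rng.standard_normal(x.shape)
    mu_y = y + 0.5 * step * grad_log_target(y)
    log_alpha = (log_target(y) - log_target(x)
                 - 0.5 * np.sum((x - mu_y) ** 2) / step
                 + 0.5 * np.sum((y - mu_x) ** 2) / step)
    return y if np.log(rng.uniform()) < log_alpha else x

x = np.zeros(2)
samples = []
for _ in range(2000):
    x = global_move(x)            # global jump between modes
    for _ in range(5):
        x = local_move(x)         # local refinement around the current mode
    samples.append(x.copy())
print("sample mean:", np.mean(samples, axis=0))
```

In an adaptive variant along the lines of $\operatorname{FlEx^2MCMC}$, the fixed Gaussian in `global_move` would be replaced by a normalizing flow whose parameters are updated from the chain's samples while sampling; the sketch above keeps the proposal fixed for simplicity.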

Cite

Text

Samsonov et al. "Local-Global MCMC Kernels: The Best of Both Worlds." Neural Information Processing Systems, 2022.

Markdown

[Samsonov et al. "Local-Global MCMC Kernels: The Best of Both Worlds." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/samsonov2022neurips-localglobal/)

BibTeX

@inproceedings{samsonov2022neurips-localglobal,
  title     = {{Local-Global MCMC Kernels: The Best of Both Worlds}},
  author    = {Samsonov, Sergey and Lagutin, Evgeny and Gabrié, Marylou and Durmus, Alain and Naumov, Alexey and Moulines, Eric},
  booktitle = {Neural Information Processing Systems},
  year      = {2022},
  url       = {https://mlanthology.org/neurips/2022/samsonov2022neurips-localglobal/}
}