Fast Conditional Mixing of MCMC Algorithms for Non-Log-Concave Distributions
Abstract
MCMC algorithms offer empirically efficient tools for sampling from a target distribution $\pi(x) \propto \exp(-V(x))$. However, on the theory side, MCMC algorithms suffer from slow mixing rates when $\pi(x)$ is non-log-concave. Our work examines this gap and shows that when a Poincaré-style inequality holds on a subset $\mathcal{X}$ of the state space, the conditional distribution of MCMC iterates over $\mathcal{X}$ mixes quickly to the true conditional distribution. This fast mixing guarantee can hold in cases when global mixing is provably slow. We formalize the statement and quantify the conditional mixing rate. We further show that conditional mixing can have interesting implications for sampling from mixtures of Gaussians, parameter estimation for Gaussian mixture models, and Gibbs sampling with well-connected local minima.
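As a rough illustration of the phenomenon the abstract describes (a sketch, not code from the paper), the snippet below runs the unadjusted Langevin algorithm on a two-mode Gaussian mixture. The mixture parameters `m` and `sigma`, the step size `h`, the chain count, and the region $\mathcal{X} = \{x > 0\}$ are all hypothetical choices for this demo: chains started in the right mode rarely cross the barrier (slow global mixing), yet their conditional distribution over $\mathcal{X}$ quickly matches the right component.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: pi(x) ∝ exp(-V(x)) for an equal-weight mixture of two 1-D Gaussians
# centered at ±m. The modes are separated by an energy barrier, so global
# mixing of Langevin-type chains is slow, but mixing within one mode
# (the conditional distribution over that region) can be fast.
m, sigma = 4.0, 1.0

def grad_V(x):
    # V(x) = -log(0.5 N(x; -m, sigma^2) + 0.5 N(x; +m, sigma^2)).
    # The gradient is (x - mean)/sigma^2, where mean is the
    # responsibility-weighted average of the component means.
    w_plus = 1.0 / (1.0 + np.exp(-2.0 * m * x / sigma**2))  # P(component +m | x)
    mean = w_plus * m + (1.0 - w_plus) * (-m)
    return (x - mean) / sigma**2

# Unadjusted Langevin algorithm (ULA):
# x_{k+1} = x_k - h * grad V(x_k) + sqrt(2h) * xi,  xi ~ N(0, 1)
h, n_steps = 0.01, 20000
x = np.full(1000, m + 2.0)  # start all chains near the right mode
for _ in range(n_steps):
    x = x - h * grad_V(x) + np.sqrt(2.0 * h) * rng.standard_normal(x.shape)

# Conditional check on X = {x > 0}: compare the chain's conditional
# mean/std over X with those of the right component N(+m, sigma^2).
in_X = x > 0
print(f"fraction of chains still in X = {in_X.mean():.3f}")  # near 1: slow global mixing
print(f"conditional mean over X = {x[in_X].mean():.3f} (target {m})")
print(f"conditional std  over X = {x[in_X].std():.3f} (target {sigma})")
```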
Cite
Text
Cheng et al. "Fast Conditional Mixing of MCMC Algorithms for Non-Log-Concave Distributions." Neural Information Processing Systems, 2023.

Markdown
[Cheng et al. "Fast Conditional Mixing of MCMC Algorithms for Non-Log-Concave Distributions." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/cheng2023neurips-fast/)

BibTeX
@inproceedings{cheng2023neurips-fast,
title = {{Fast Conditional Mixing of MCMC Algorithms for Non-Log-Concave Distributions}},
author = {Cheng, Xiang and Wang, Bohan and Zhang, Jingzhao and Zhu, Yusong},
booktitle = {Neural Information Processing Systems},
year = {2023},
url = {https://mlanthology.org/neurips/2023/cheng2023neurips-fast/}
}