Convergence of Langevin Monte Carlo in Chi-Squared and Rényi Divergence
Abstract
We study sampling from a target distribution $\nu_* = e^{-f}$ using the unadjusted Langevin Monte Carlo (LMC) algorithm when the potential $f$ satisfies a strong dissipativity condition and is first-order smooth with a Lipschitz gradient. We prove that, when initialized with a Gaussian random vector of sufficiently small variance, iterating the LMC algorithm for $\widetilde{\mathcal{O}}(\lambda^2 d\epsilon^{-1})$ steps suffices to reach an $\epsilon$-neighborhood of the target in both Chi-squared and Rényi divergence, where $\lambda$ is the logarithmic Sobolev constant of $\nu_*$. Our results do not require a warm start to deal with the exponential dimension dependence of the Chi-squared divergence at initialization. In particular, for strongly convex and first-order smooth potentials, we show that the LMC algorithm achieves the rate estimate $\widetilde{\mathcal{O}}(d\epsilon^{-1})$, which improves the previously known rates in both of these metrics under the same assumptions. Translated to other metrics, our results also recover the state-of-the-art rate estimates in KL divergence, total variation, and $2$-Wasserstein distance in the same setup. Finally, as we rely on the logarithmic Sobolev inequality, our framework covers a range of non-convex potentials that are first-order smooth and exhibit strong convexity outside a compact region.
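The unadjusted LMC iteration analyzed above is the standard Euler–Maruyama discretization of the Langevin diffusion, $x_{k+1} = x_k - \eta \nabla f(x_k) + \sqrt{2\eta}\,\xi_k$ with $\xi_k \sim \mathcal{N}(0, I_d)$. A minimal sketch follows, using the standard Gaussian as the target (potential $f(x) = \|x\|^2/2$) purely for illustration; the step size, number of steps, and small-variance Gaussian initialization are illustrative choices, not the paper's tuned constants.

```python
import numpy as np

def lmc_sample(grad_f, x0, step, n_steps, rng):
    """Unadjusted Langevin Monte Carlo.

    Iterates x_{k+1} = x_k - step * grad_f(x_k) + sqrt(2 * step) * N(0, I)
    and returns the final iterate. Works on a batch of chains stacked
    along the first axis.
    """
    x = x0
    for _ in range(n_steps):
        x = x - step * grad_f(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    return x

rng = np.random.default_rng(0)
d = 5
# Gaussian initialization with small variance, as in the paper's setup;
# 1000 independent chains are run in parallel.
x0 = 0.1 * rng.standard_normal((1000, d))
# For f(x) = ||x||^2 / 2, the gradient is the identity map and the
# target nu_* = e^{-f} is (proportional to) the standard Gaussian.
samples = lmc_sample(lambda x: x, x0, step=0.01, n_steps=2000, rng=rng)
print(samples.mean(), samples.var())
```

For this Gaussian target, the stationary mean and variance of the discretized chain are $0$ and $1/(1 - \eta/2) \approx 1$, so the printed statistics should be close to $0$ and $1$ for a small step size; the residual variance bias is exactly the discretization error the paper's analysis controls.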
Cite
Erdogdu et al. "Convergence of Langevin Monte Carlo in Chi-Squared and Rényi Divergence." Artificial Intelligence and Statistics, 2022.

BibTeX
@inproceedings{erdogdu2022aistats-convergence,
title = {{Convergence of Langevin Monte Carlo in Chi-Squared and Rényi Divergence}},
author = {Erdogdu, Murat A. and Hosseinzadeh, Rasa and Zhang, Shunshi},
booktitle = {Artificial Intelligence and Statistics},
year = {2022},
pages = {8151-8175},
volume = {151},
url = {https://mlanthology.org/aistats/2022/erdogdu2022aistats-convergence/}
}