Improved Analysis for a Proximal Algorithm for Sampling

Abstract

We study the proximal sampler of Lee, Shen, and Tian (2021) and obtain new convergence guarantees under weaker assumptions than strong log-concavity: namely, our results hold for (1) weakly log-concave targets, and (2) targets satisfying isoperimetric assumptions which allow for non-log-concavity. We demonstrate our results by obtaining new state-of-the-art sampling guarantees for several classes of target distributions. We also strengthen the connection between the proximal sampler and the proximal method in optimization by interpreting the former as an entropically regularized Wasserstein gradient flow and the latter as the limit of one.
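For intuition, the proximal sampler alternates two steps on the augmented distribution π(x, y) ∝ exp(−f(x) − ‖x − y‖²/(2η)): a forward step drawing y ∼ N(x, ηI), and a backward step drawing x from the restricted Gaussian oracle π(x | y). The sketch below is illustrative only (the target, step size η, and iteration count are our own toy choices, not from the paper): for a standard Gaussian target f(x) = x²/2, the oracle is itself Gaussian with mean y/(1 + η) and variance η/(1 + η), so both steps can be sampled exactly.

```python
import math
import random


def proximal_sampler_gaussian(eta=0.5, n_iters=20000, seed=0):
    """Toy proximal sampler for the 1D standard Gaussian target
    pi(x) ∝ exp(-x^2/2). Here the restricted Gaussian oracle
    pi(x | y) ∝ exp(-x^2/2 - (x - y)^2 / (2*eta)) is itself Gaussian
    with mean y/(1+eta) and variance eta/(1+eta), so the backward
    step is exact; for general targets it must be implemented with
    rejection sampling or another inner solver."""
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_iters):
        # Forward step: y | x ~ N(x, eta)
        y = x + math.sqrt(eta) * rng.gauss(0.0, 1.0)
        # Backward step (restricted Gaussian oracle): x | y ~ N(y/(1+eta), eta/(1+eta))
        x = y / (1.0 + eta) + math.sqrt(eta / (1.0 + eta)) * rng.gauss(0.0, 1.0)
        samples.append(x)
    return samples
```

Running the chain and checking the empirical mean and variance against the target's (0 and 1) is a quick sanity check that the two-step scheme leaves the target invariant in this closed-form case.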

Cite

Text

Chen et al. "Improved Analysis for a Proximal Algorithm for Sampling." Conference on Learning Theory, 2022.

Markdown

[Chen et al. "Improved Analysis for a Proximal Algorithm for Sampling." Conference on Learning Theory, 2022.](https://mlanthology.org/colt/2022/chen2022colt-improved/)

BibTeX

@inproceedings{chen2022colt-improved,
  title     = {{Improved Analysis for a Proximal Algorithm for Sampling}},
  author    = {Chen, Yongxin and Chewi, Sinho and Salim, Adil and Wibisono, Andre},
  booktitle = {Conference on Learning Theory},
  year      = {2022},
  pages     = {2984--3014},
  volume    = {178},
  url       = {https://mlanthology.org/colt/2022/chen2022colt-improved/}
}