Fast Convergence of Langevin Dynamics on Manifold: Geodesics Meet Log-Sobolev
Abstract
Sampling is a fundamental task with numerous applications in Machine Learning. One approach to sample from a high-dimensional distribution $e^{-f}$, for some function $f$, is the Langevin Algorithm (LA). Recently, there has been substantial progress in showing fast convergence of LA even when $f$ is non-convex, notably \cite{VW19}, \cite{MoritaRisteski}: the former focuses on functions $f$ defined on $\mathbb{R}^n$, while the latter focuses on functions with symmetries (such as matrix completion type objectives) that induce a manifold structure. Our work generalizes the results of \cite{VW19} to the setting where $f$ is defined on a manifold $M$ rather than $\mathbb{R}^n$. From a technical point of view, we show that the KL divergence decreases at a geometric rate whenever the distribution $e^{-f}$ satisfies a log-Sobolev inequality on $M$.
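For intuition, recall the standard Euclidean picture that the paper generalizes: under a log-Sobolev inequality with constant $\alpha$, the continuous-time Langevin diffusion satisfies $\mathrm{KL}(\rho_t \,\|\, e^{-f}) \le e^{-2\alpha t}\,\mathrm{KL}(\rho_0 \,\|\, e^{-f})$, and the discrete-time algorithm tracks this up to discretization error \cite{VW19}. Below is a minimal sketch of the discrete Euclidean Langevin Algorithm (Euler-Maruyama steps), not the authors' manifold variant, which would replace the linear update with a geodesic/exponential-map step on $M$; the function names and parameters are illustrative.

```python
import numpy as np

def langevin_algorithm(grad_f, x0, step_size=1e-2, n_steps=10_000, rng=None):
    """Unadjusted Langevin Algorithm in R^n (illustrative sketch).

    Iterates x_{k+1} = x_k - eta * grad f(x_k) + sqrt(2 * eta) * xi_k,
    whose law approaches e^{-f} up to discretization error; under a
    log-Sobolev inequality the KL divergence to the target contracts
    geometrically in the continuous-time limit.
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        # Gradient drift plus Gaussian diffusion (Euler-Maruyama step).
        x = x - step_size * grad_f(x) + np.sqrt(2.0 * step_size) * noise
        samples.append(x.copy())
    return np.array(samples)

# Example: sample from a standard Gaussian, f(x) = ||x||^2 / 2.
if __name__ == "__main__":
    draws = langevin_algorithm(lambda x: x, x0=np.zeros(2), rng=0)
    print(draws[-5:])
```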
Cite
Text
Wang et al. "Fast Convergence of Langevin Dynamics on Manifold: Geodesics Meet Log-Sobolev." Neural Information Processing Systems, 2020.

Markdown
[Wang et al. "Fast Convergence of Langevin Dynamics on Manifold: Geodesics Meet Log-Sobolev." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/wang2020neurips-fast/)

BibTeX
@inproceedings{wang2020neurips-fast,
title = {{Fast Convergence of Langevin Dynamics on Manifold: Geodesics Meet Log-Sobolev}},
author = {Wang, Xiao and Lei, Qi and Panageas, Ioannis},
booktitle = {Neural Information Processing Systems},
year = {2020},
url = {https://mlanthology.org/neurips/2020/wang2020neurips-fast/}
}