Wasserstein Distance Estimates for the Distributions of Numerical Approximations to Ergodic Stochastic Differential Equations

Abstract

We present a framework that allows for the non-asymptotic study of the $2$-Wasserstein distance between the invariant distribution of an ergodic stochastic differential equation and the distribution of its numerical approximation in the strongly log-concave case. This allows us to study in a unified way a number of different integrators proposed in the literature for the overdamped and underdamped Langevin dynamics. In addition, we analyze a novel splitting method for the underdamped Langevin dynamics that requires only one gradient evaluation per time step. Under an additional smoothness assumption on a $d$-dimensional strongly log-concave distribution with condition number $\kappa$, the algorithm is shown to produce, with $\mathcal{O}\big(\kappa^{5/4} d^{1/4}\epsilon^{-1/2} \big)$ complexity, samples from a distribution that, in Wasserstein distance, is at most $\epsilon>0$ away from the target distribution.

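The splitting method mentioned in the abstract targets the underdamped Langevin dynamics $dX = V\,dt$, $dV = -\nabla f(X)\,dt - \gamma V\,dt + \sqrt{2\gamma}\,dW$. As an illustration only, the Python/NumPy sketch below shows one way a one-gradient-per-step splitting of this kind can be implemented: an exact half-step of the linear Ornstein-Uhlenbeck/transport part, a full gradient kick, then another exact half-step (a UBU-type composition). The function names, the friction parameter gamma, and the user-supplied grad_f are our own illustrative choices and need not match the exact scheme or coefficients analyzed in the paper.

import numpy as np

def ou_half_step(x, v, h, gamma, rng):
    # Exact Gaussian transition of the linear part dX = V dt, dV = -gamma V dt + sqrt(2 gamma) dW
    # over a time interval h; applied with h/2 inside the composition below.
    eta = np.exp(-gamma * h)
    x_mean = x + (1.0 - eta) / gamma * v
    v_mean = eta * v
    # Per-coordinate covariance of the exact (position, velocity) noise increment.
    var_x = (2.0 / gamma) * (h - 2.0 * (1.0 - eta) / gamma + (1.0 - eta ** 2) / (2.0 * gamma))
    var_v = 1.0 - eta ** 2
    cov_xv = (1.0 - eta) ** 2 / gamma
    chol = np.linalg.cholesky(np.array([[var_x, cov_xv], [cov_xv, var_v]]))
    noise = chol @ rng.standard_normal((2, x.size))  # correlated in (x, v), independent across coordinates
    return x_mean + noise[0], v_mean + noise[1]

def splitting_step(x, v, h, gamma, grad_f, rng):
    # One step of a UBU-type composition: half OU step, full gradient kick, half OU step.
    # Only one evaluation of grad_f is needed per step.
    x, v = ou_half_step(x, v, 0.5 * h, gamma, rng)
    v = v - h * grad_f(x)
    x, v = ou_half_step(x, v, 0.5 * h, gamma, rng)
    return x, v

# Example usage with a hypothetical standard Gaussian target (grad_f(x) = x, so kappa = 1):
rng = np.random.default_rng(0)
x, v = np.zeros(10), np.zeros(10)
for _ in range(1000):
    x, v = splitting_step(x, v, h=0.1, gamma=2.0, grad_f=lambda x: x, rng=rng)

Iterating splitting_step with a fixed step size produces a chain whose position marginal approximates the target distribution; in the paper's setting the approximation error is measured in the $2$-Wasserstein distance.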
Cite

Text

Sanz-Serna and Zygalakis. "Wasserstein Distance Estimates for the Distributions of Numerical Approximations to Ergodic Stochastic Differential Equations." Journal of Machine Learning Research, 2021.

Markdown

[Sanz-Serna and Zygalakis. "Wasserstein Distance Estimates for the Distributions of Numerical Approximations to Ergodic Stochastic Differential Equations." Journal of Machine Learning Research, 2021.](https://mlanthology.org/jmlr/2021/sanzserna2021jmlr-wasserstein/)

BibTeX

@article{sanzserna2021jmlr-wasserstein,
  title     = {{Wasserstein Distance Estimates for the Distributions of Numerical Approximations to Ergodic Stochastic Differential Equations}},
  author    = {Sanz-Serna, Jesus Maria and Zygalakis, Konstantinos C.},
  journal   = {Journal of Machine Learning Research},
  year      = {2021},
  pages     = {1-37},
  volume    = {22},
  url       = {https://mlanthology.org/jmlr/2021/sanzserna2021jmlr-wasserstein/}
}