A Non-Asymptotic Analysis for Stein Variational Gradient Descent

Abstract

We study the Stein Variational Gradient Descent (SVGD) algorithm, which optimises a set of particles to approximate a target probability distribution $\pi \propto e^{-V}$ on $\mathbb{R}^d$. In the population limit, SVGD performs gradient descent in the space of probability distributions on the KL divergence with respect to $\pi$, where the gradient is smoothed through a kernel integral operator. In this paper, we provide a novel finite-time analysis of the SVGD algorithm. We prove a descent lemma establishing that the algorithm decreases the objective at each iteration, together with rates of convergence. We also provide a convergence result for the finite-particle system corresponding to the practical implementation of SVGD towards its population version.
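To make the update the abstract alludes to concrete: with a kernel $k$ and particles $x_1, \dots, x_n$, each SVGD iteration moves every particle by $x_i \leftarrow x_i + \gamma \hat{\phi}(x_i)$, where $\hat{\phi}(x) = \frac{1}{n}\sum_{j=1}^{n}\left[k(x_j, x)\,\nabla \log \pi(x_j) + \nabla_{x_j} k(x_j, x)\right]$ and $\nabla \log \pi = -\nabla V$. The first term transports particles towards high-density regions of $\pi$; the second acts as a repulsive force that keeps them spread apart. Below is a minimal NumPy sketch of this finite-particle update; the RBF kernel, the fixed bandwidth `h`, the step size, and the Gaussian-target example are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def svgd_step(X, score, step_size, h=1.0):
    """One SVGD update for particles X of shape (n, d).

    score(X) must return grad log pi evaluated at each particle, shape (n, d).
    The RBF kernel k(x, y) = exp(-||x - y||^2 / h) with a fixed bandwidth h is
    an illustrative choice; practical implementations often adapt h (e.g. via
    a median heuristic).
    """
    n = X.shape[0]
    diffs = X[:, None, :] - X[None, :, :]         # diffs[j, i] = x_j - x_i
    K = np.exp(-np.sum(diffs ** 2, axis=-1) / h)  # K[j, i] = k(x_j, x_i)
    grad_K = -2.0 / h * diffs * K[:, :, None]     # grad wrt x_j of k(x_j, x_i)
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) score(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (K.T @ score(X) + grad_K.sum(axis=0)) / n
    return X + step_size * phi

# Example: target pi = N(0, I_2), for which grad log pi(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2)) + 5.0                # start far from the target
for _ in range(1000):
    X = svgd_step(X, lambda Z: -Z, step_size=0.2)
print(X.mean(axis=0))                             # should be close to zero
```

With these (assumed) hyperparameters the particle cloud drifts to the target's mode and spreads to roughly unit scale; note that vanilla SVGD with few particles can underestimate the target variance.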

Cite

Text

Korba et al. "A Non-Asymptotic Analysis for Stein Variational Gradient Descent." Neural Information Processing Systems, 2020.

Markdown

[Korba et al. "A Non-Asymptotic Analysis for Stein Variational Gradient Descent." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/korba2020neurips-nonasymptotic/)

BibTeX

@inproceedings{korba2020neurips-nonasymptotic,
  title     = {{A Non-Asymptotic Analysis for Stein Variational Gradient Descent}},
  author    = {Korba, Anna and Salim, Adil and Arbel, Michael and Luise, Giulia and Gretton, Arthur},
  booktitle = {Neural Information Processing Systems},
  year      = {2020},
  url       = {https://mlanthology.org/neurips/2020/korba2020neurips-nonasymptotic/}
}