A Finite-Particle Convergence Rate for Stein Variational Gradient Descent
Abstract
We provide the first finite-particle convergence rate for Stein variational gradient descent (SVGD), a popular algorithm for approximating a probability distribution with a collection of particles. Specifically, whenever the target distribution is sub-Gaussian with a Lipschitz score, SVGD with $n$ particles and an appropriate step size sequence drives the kernel Stein discrepancy to zero at an order $1/\sqrt{\log\log n}$ rate. We suspect that the dependence on $n$ can be improved, and we hope that our explicit, non-asymptotic proof strategy will serve as a template for future refinements.
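For readers unfamiliar with the algorithm, the following is a minimal Python sketch of a standard SVGD particle update with an RBF kernel. The bandwidth h, step size eps, particle count, and the standard-Gaussian toy target are illustrative assumptions only; in particular, the fixed step size does not reflect the step size sequence analyzed in the paper.

import numpy as np

def svgd_step(X, score, h=1.0, eps=0.1):
    # X: (n, d) array of particles; score maps (n, d) -> (n, d) rows of grad log p.
    # RBF kernel k(x, y) = exp(-||x - y||^2 / h) and its analytic gradient.
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / h)                       # (n, n) kernel matrix
    # sum_j grad_{x_j} k(x_j, x_i) = (2/h) * sum_j K[i, j] * (x_i - x_j)
    grad_K = (2.0 / h) * (K.sum(axis=1, keepdims=True) * X - K @ X)
    # Stein direction: phi(x_i) = (1/n) sum_j [k(x_j, x_i) score(x_j) + grad_{x_j} k(x_j, x_i)]
    phi = (K @ score(X) + grad_K) / X.shape[0]
    return X + eps * phi

# Toy usage: standard Gaussian target, whose score is simply -x.
rng = np.random.default_rng(0)
particles = rng.normal(size=(50, 2)) + 3.0          # initialize far from the target
for _ in range(200):
    particles = svgd_step(particles, lambda x: -x)

The first term in phi pulls particles toward high-density regions of the target, while the kernel-gradient term acts as a repulsive force that keeps the particles spread out; it is the discrepancy between this empirical update and its mean-field limit that a finite-particle analysis must control.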
Cite
Text
Shi and Mackey. "A Finite-Particle Convergence Rate for Stein Variational Gradient Descent." NeurIPS 2022 Workshops: OPT, 2022.
Markdown
[Shi and Mackey. "A Finite-Particle Convergence Rate for Stein Variational Gradient Descent." NeurIPS 2022 Workshops: OPT, 2022.](https://mlanthology.org/neuripsw/2022/shi2022neuripsw-finiteparticle/)
BibTeX
@inproceedings{shi2022neuripsw-finiteparticle,
title = {{A Finite-Particle Convergence Rate for Stein Variational Gradient Descent}},
author = {Shi, Jiaxin and Mackey, Lester},
booktitle = {NeurIPS 2022 Workshops: OPT},
year = {2022},
url = {https://mlanthology.org/neuripsw/2022/shi2022neuripsw-finiteparticle/}
}