A Finite-Particle Convergence Rate for Stein Variational Gradient Descent
Abstract
We provide the first finite-particle convergence rate for Stein variational gradient descent (SVGD), a popular algorithm for approximating a probability distribution with a collection of particles. Specifically, whenever the target distribution is sub-Gaussian with a Lipschitz score, SVGD with $n$ particles and an appropriate step size sequence drives the kernel Stein discrepancy to zero at an order $1/\sqrt{\log\log n}$ rate. We suspect that the dependence on $n$ can be improved, and we hope that our explicit, non-asymptotic proof strategy will serve as a template for future refinements.
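For readers unfamiliar with the algorithm, below is a minimal NumPy sketch of the SVGD particle update together with a V-statistic estimate of the (Langevin) kernel Stein discrepancy that the abstract refers to. The RBF bandwidth `h`, the decaying step-size schedule `0.5 / sqrt(1 + t)`, and the standard Gaussian target (which is sub-Gaussian with a Lipschitz score) are illustrative assumptions for this sketch, not the exact choices analyzed in the paper.

```python
import numpy as np

def rbf_kernel(X, h):
    """RBF kernel K[i, j] = exp(-||x_i - x_j||^2 / h) and its gradients."""
    diff = X[:, None, :] - X[None, :, :]           # (n, n, d); diff[i, j] = x_i - x_j
    sq = np.sum(diff ** 2, axis=-1)                # (n, n) squared distances
    K = np.exp(-sq / h)
    grad_K = -(2.0 / h) * diff * K[..., None]      # grad_K[i, j] = grad_{x_i} k(x_i, x_j)
    return K, grad_K, sq

def svgd_step(X, score, eps, h):
    """One SVGD update: x_i += eps/n * sum_j [k(x_j, x_i) s(x_j) + grad_{x_j} k(x_j, x_i)]."""
    n = X.shape[0]
    K, grad_K, _ = rbf_kernel(X, h)
    # By symmetry of the RBF kernel, grad_{x_j} k(x_j, x_i) = grad_K[j, i].
    phi = (K @ score(X) + grad_K.sum(axis=0)) / n
    return X + eps * phi

def ksd_squared(X, score, h):
    """V-statistic estimate of the squared kernel Stein discrepancy."""
    n, d = X.shape
    K, grad_K, sq = rbf_kernel(X, h)
    S = score(X)                                    # S[i] = s(x_i)
    term1 = (S @ S.T) * K                           # s(x_i)^T s(x_j) k(x_i, x_j)
    term2 = np.einsum('id,ijd->ij', S, -grad_K)     # s(x_i)^T grad_{x_j} k(x_i, x_j)
    term3 = np.einsum('jd,ijd->ij', S, grad_K)      # grad_{x_i} k(x_i, x_j)^T s(x_j)
    term4 = (2.0 * d / h - 4.0 * sq / h ** 2) * K   # trace(grad_{x_i} grad_{x_j} k)
    return (term1 + term2 + term3 + term4).sum() / n ** 2

rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, size=(200, 2))              # n = 200 particles, off-target init
score = lambda X: -X                                # standard Gaussian target score
for t in range(1000):
    X = svgd_step(X, score, eps=0.5 / np.sqrt(1 + t), h=1.0)
print(ksd_squared(X, score, h=1.0))                 # shrinks toward zero as SVGD runs
```

Monitoring `ksd_squared` over iterations mirrors the quantity the paper bounds: the convergence result says this discrepancy can be driven to zero as the particle count $n$ grows, at the stated $1/\sqrt{\log\log n}$ order.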
Cite
Text
Shi and Mackey. "A Finite-Particle Convergence Rate for Stein Variational Gradient Descent." Neural Information Processing Systems, 2023.
Markdown
[Shi and Mackey. "A Finite-Particle Convergence Rate for Stein Variational Gradient Descent." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/shi2023neurips-finiteparticle/)
BibTeX
@inproceedings{shi2023neurips-finiteparticle,
  title = {{A Finite-Particle Convergence Rate for Stein Variational Gradient Descent}},
  author = {Shi, Jiaxin and Mackey, Lester W.},
  booktitle = {Neural Information Processing Systems},
  year = {2023},
  url = {https://mlanthology.org/neurips/2023/shi2023neurips-finiteparticle/}
}