Long-Time Asymptotics of Noisy SVGD Outside the Population Limit
Abstract
Stein Variational Gradient Descent (SVGD) is a widely used sampling algorithm that has been successfully applied in several areas of Machine Learning. SVGD operates by iteratively moving a set of $n$ interacting particles (which represent the samples) to approximate the target distribution. Despite recent studies on the complexity of SVGD and its variants, their long-time asymptotic behavior (i.e., as the number of iterations $k$ grows) is still not understood in the finite-particle regime. We study the long-time asymptotic behavior of a noisy variant of SVGD. First, we establish that the limit set of noisy SVGD for large $k$ is well-defined. We then characterize this limit set, showing that it approaches the target distribution as $n$ increases. In particular, noisy SVGD avoids the variance collapse observed for SVGD. Our approach rests on showing that the trajectories of noisy SVGD closely resemble those described by a McKean-Vlasov process.
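To make the update rule described above concrete, here is a minimal NumPy sketch of one noisy SVGD iteration. It assumes an RBF kernel and isotropic Gaussian noise injected at each step; `grad_log_p`, `step`, and `noise_scale` are hypothetical names, and the precise noise scaling analyzed in the paper is not reproduced here. This is a generic illustration of the noisy-SVGD idea, not the authors' implementation.

```python
import numpy as np

def rbf_kernel_and_grad(x, h=1.0):
    """RBF kernel matrix and its gradient for particles x of shape (n, d).

    Returns K with K[i, j] = exp(-||x_i - x_j||^2 / (2 h^2)) and
    grad_K with grad_K[i, j] = d k(x_i, x_j) / d x_i, shape (n, n, d).
    """
    diff = x[:, None, :] - x[None, :, :]        # (n, n, d)
    sq = np.sum(diff ** 2, axis=-1)             # (n, n)
    K = np.exp(-sq / (2 * h ** 2))
    grad_K = -diff / h ** 2 * K[:, :, None]
    return K, grad_K

def noisy_svgd_step(x, grad_log_p, step, rng, noise_scale):
    """One noisy SVGD update for n particles x of shape (n, d).

    grad_log_p maps (n, d) particle positions to the score of the
    target at each particle. noise_scale is a hypothetical parameter
    controlling the injected Gaussian perturbation.
    """
    n = x.shape[0]
    K, grad_K = rbf_kernel_and_grad(x)
    # Standard SVGD direction: kernel-weighted scores (attraction to
    # high-density regions) plus kernel gradients (repulsion between
    # particles). K is symmetric, so K @ scores sums k(x_j, x_i) over j.
    drift = (K @ grad_log_p(x) + grad_K.sum(axis=0)) / n
    # Injected noise (assumed sqrt(step) scaling, as in Langevin-type
    # schemes); the paper's exact schedule may differ.
    noise = noise_scale * np.sqrt(step) * rng.standard_normal(x.shape)
    return x + step * drift + noise

# Example usage: sampling from a standard Gaussian target, for which
# the score is grad_log_p(x) = -x.
rng = np.random.default_rng(0)
x = rng.standard_normal((100, 2))
for _ in range(500):
    x = noisy_svgd_step(x, lambda x: -x, step=0.1, rng=rng, noise_scale=0.1)
```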
Cite
Text
Priser et al. "Long-Time Asymptotics of Noisy SVGD Outside the Population Limit." International Conference on Learning Representations, 2025.
Markdown
[Priser et al. "Long-Time Asymptotics of Noisy SVGD Outside the Population Limit." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/priser2025iclr-longtime/)
BibTeX
@inproceedings{priser2025iclr-longtime,
  title     = {{Long-Time Asymptotics of Noisy SVGD Outside the Population Limit}},
  author    = {Priser, Victor and Bianchi, Pascal and Salim, Adil},
  booktitle = {International Conference on Learning Representations},
  year      = {2025},
  url       = {https://mlanthology.org/iclr/2025/priser2025iclr-longtime/}
}