Convergence Aspects of Hybrid Kernel SVGD

Abstract

Stein variational gradient descent (SVGD) is a particle-based approximate inference algorithm. Many variants of SVGD have been proposed in recent years, including the hybrid kernel variant (h-SVGD), which has demonstrated promising results on image classification with deep neural network ensembles. By framing h-SVGD as a kernelised Wasserstein gradient flow on a functional that is not the Kullback-Leibler divergence, we show that h-SVGD does not converge to the target distribution in the mean field limit. Despite this theoretical result, we provide intuition and experimental support for the ability of h-SVGD to improve variance estimation in high dimensions. Unlike other SVGD variants that also alleviate variance collapse, h-SVGD does so at no additional computational cost and without further assumptions on the posterior.
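
To make the hybrid kernel idea concrete, below is a minimal sketch of an h-SVGD-style particle update, assuming fixed-bandwidth RBF kernels and the standard SVGD update form; the function names and the toy Gaussian target are illustrative assumptions, not the paper's implementation. The driving (score) term uses one kernel while the repulsive term uses a second kernel, and the update reduces to vanilla SVGD when the two bandwidths coincide.

import numpy as np

def rbf_kernel_and_repulsion(X, bandwidth):
    # RBF kernel matrix K[i, j] = k(x_i, x_j) and the repulsive term
    # sum_j grad_{x_j} k(x_j, x_i) for each particle x_i.
    diffs = X[:, None, :] - X[None, :, :]            # (n, n, d): x_i - x_j
    sq_dists = np.sum(diffs ** 2, axis=-1)           # (n, n)
    K = np.exp(-sq_dists / (2.0 * bandwidth ** 2))   # (n, n)
    repulsion = np.einsum('ij,ijd->id', K, diffs) / bandwidth ** 2
    return K, repulsion

def hsvgd_step(X, score_fn, h_drive, h_repulse, step_size):
    # One hybrid kernel SVGD step: the kernel with bandwidth h_drive weights
    # the score (attractive) term, while a second kernel with bandwidth
    # h_repulse generates the repulsive term that spreads the particles.
    n = X.shape[0]
    K_drive, _ = rbf_kernel_and_repulsion(X, h_drive)
    _, repulsion = rbf_kernel_and_repulsion(X, h_repulse)
    scores = score_fn(X)                             # (n, d): rows are grad log p(x_j)
    phi = (K_drive @ scores + repulsion) / n
    return X + step_size * phi

# Toy usage: standard normal target; setting h_drive == h_repulse recovers vanilla SVGD.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
for _ in range(200):
    X = hsvgd_step(X, lambda X: -X, h_drive=1.0, h_repulse=0.5, step_size=0.1)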

Cite

Text

MacDonald et al. "Convergence Aspects of Hybrid Kernel SVGD." Transactions on Machine Learning Research, 2025.

Markdown

[MacDonald et al. "Convergence Aspects of Hybrid Kernel SVGD." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/macdonald2025tmlr-convergence/)

BibTeX

@article{macdonald2025tmlr-convergence,
  title     = {{Convergence Aspects of Hybrid Kernel SVGD}},
  author    = {MacDonald, Anson and Sisson, Scott A. and Pathiraja, Sahani},
  journal   = {Transactions on Machine Learning Research},
  year      = {2025},
  url       = {https://mlanthology.org/tmlr/2025/macdonald2025tmlr-convergence/}
}