SVGD as a Kernelized Wasserstein Gradient Flow of the Chi-Squared Divergence
Abstract
Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is often described as the kernelized gradient flow of the Kullback-Leibler divergence in the geometry of optimal transport. We introduce a new perspective that instead views SVGD as the kernelized gradient flow of the chi-squared divergence. Motivated by this perspective, we provide a convergence analysis of the chi-squared gradient flow. We also show that our new perspective provides better guidelines for choosing effective kernels for SVGD.
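For context, here is a minimal NumPy sketch of the standard SVGD particle update (Liu & Wang, 2016) that the paper reinterprets; the RBF bandwidth h and step size below are illustrative placeholders, not the kernel choices the paper recommends.

```python
import numpy as np

def svgd_step(X, grad_log_p, step_size=0.1, h=1.0):
    """One SVGD update with the RBF kernel k(x, y) = exp(-||x - y||^2 / h).

    Each particle moves along
        phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ].
    """
    n = X.shape[0]
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / h)                   # symmetric (n, n) kernel matrix
    scores = grad_log_p(X)                      # (n, d) evaluations of grad log p
    attraction = K @ scores / n                 # kernel-weighted average of scores
    # Repulsive term: grad_{x_j} k(x_j, x_i) = (2/h) * (x_i - x_j) * K[j, i], summed over j.
    repulsion = (2.0 / h) * (K.sum(axis=1, keepdims=True) * X - K @ X) / n
    return X + step_size * (attraction + repulsion)

# Example: sample from a standard Gaussian, whose score is grad log p(x) = -x.
rng = np.random.default_rng(0)
particles = rng.normal(size=(100, 2)) + 5.0     # initialize far from the target
for _ in range(500):
    particles = svgd_step(particles, lambda X: -X)
```

The attraction term drives particles toward high-density regions of the target, while the repulsion term keeps them spread out; the paper's contribution is to identify the continuous-time limit of this dynamic with a kernelized gradient flow of the chi-squared divergence rather than the KL divergence.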
Cite
Text
Chewi et al. "SVGD as a Kernelized Wasserstein Gradient Flow of the Chi-Squared Divergence." Neural Information Processing Systems, 2020.
Markdown
[Chewi et al. "SVGD as a Kernelized Wasserstein Gradient Flow of the Chi-Squared Divergence." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/chewi2020neurips-svgd/)
BibTeX
@inproceedings{chewi2020neurips-svgd,
title = {{SVGD as a Kernelized Wasserstein Gradient Flow of the Chi-Squared Divergence}},
author = {Chewi, Sinho and Le Gouic, Thibaut and Lu, Chen and Maunu, Tyler and Rigollet, Philippe},
booktitle = {Neural Information Processing Systems},
year = {2020},
url = {https://mlanthology.org/neurips/2020/chewi2020neurips-svgd/}
}