Targeted Separation and Convergence with Kernel Discrepancies
Abstract
Kernel Stein discrepancies (KSDs) are maximum mean discrepancies (MMDs) that leverage the score information of distributions, and have grown central to a wide range of applications. In most settings, these MMDs are required to $(i)$ separate a target $\mathrm{P}$ from other probability measures or even $(ii)$ control weak convergence to $\mathrm{P}$. In this article we derive new sufficient and necessary conditions that substantially broaden the known conditions for KSD separation and convergence control, and develop the first KSDs known to metrize weak convergence to $\mathrm{P}$. Along the way, we highlight the implications of our results for hypothesis testing, measuring and improving sample quality, and sampling with Stein variational gradient descent.
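For background (a standard formulation of the Langevin KSD, stated here as context and not quoted from the paper): given a target $\mathrm{P}$ on $\mathbb{R}^d$ with differentiable density $p$, score $s_p = \nabla \log p$, and a base reproducing kernel $k$, the squared KSD of a probability measure $\mathrm{Q}$ admits the closed form
$$\mathrm{KSD}_{\mathrm{P}}^2(\mathrm{Q}) = \mathbb{E}_{X, X' \sim \mathrm{Q}}\big[k_{\mathrm{P}}(X, X')\big], \qquad k_{\mathrm{P}}(x, y) = \nabla_x \!\cdot\! \nabla_y k(x, y) + s_p(x)^\top \nabla_y k(x, y) + s_p(y)^\top \nabla_x k(x, y) + s_p(x)^\top s_p(y)\, k(x, y),$$
so the KSD is the MMD generated by the Stein kernel $k_{\mathrm{P}}$, which depends on $\mathrm{P}$ only through its score $s_p$; this is the sense in which these discrepancies leverage score information.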
Cite
Text
Barp et al. "Targeted Separation and Convergence with Kernel Discrepancies." NeurIPS 2022 Workshops: SBM, 2022.
Markdown
[Barp et al. "Targeted Separation and Convergence with Kernel Discrepancies." NeurIPS 2022 Workshops: SBM, 2022.](https://mlanthology.org/neuripsw/2022/barp2022neuripsw-targeted/)
BibTeX
@inproceedings{barp2022neuripsw-targeted,
title = {{Targeted Separation and Convergence with Kernel Discrepancies}},
author = {Barp, Alessandro and Simon-Gabriel, Carl-Johann and Girolami, Mark and Mackey, Lester},
booktitle = {NeurIPS 2022 Workshops: SBM},
year = {2022},
url = {https://mlanthology.org/neuripsw/2022/barp2022neuripsw-targeted/}
}