Metrizing Weak Convergence with Maximum Mean Discrepancies
Abstract
This paper characterizes the maximum mean discrepancies (MMD) that metrize the weak convergence of probability measures for a wide class of kernels. More precisely, we prove that, on a locally compact, non-compact, Hausdorff space, the MMD of a bounded continuous Borel measurable kernel $k$, whose RKHS-functions vanish at infinity (i.e., $H_k \subset C_0$), metrizes the weak convergence of probability measures if and only if $k$ is continuous and integrally strictly positive definite ($\int$s.p.d.) over all signed, finite, regular Borel measures. We also correct a prior result of Simon-Gabriel and Schölkopf (JMLR 2018, Thm. 12) by showing that there exist both bounded continuous $\int$s.p.d. kernels that do not metrize weak convergence and bounded continuous non-$\int$s.p.d. kernels that do metrize it.
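For readers parsing the statement, the following standard definitions may help (they are not spelled out on this page, but are standard in the MMD literature). The MMD associated with a kernel $k$ with RKHS $H_k$ is

$$ \mathrm{MMD}_k(P, Q) \;=\; \Big\| \int k(x, \cdot)\,\mathrm{d}P(x) \;-\; \int k(x, \cdot)\,\mathrm{d}Q(x) \Big\|_{H_k}, $$

and $k$ is integrally strictly positive definite ($\int$s.p.d.) over a set of measures if

$$ \iint k(x, y)\,\mathrm{d}\mu(x)\,\mathrm{d}\mu(y) \;>\; 0 \quad \text{for every nonzero measure } \mu \text{ in that set.} $$

"Metrizes weak convergence" means that $\mathrm{MMD}_k(P_n, P) \to 0$ if and only if $P_n$ converges weakly to $P$. As a concrete instance, the Gaussian kernel on $\mathbb{R}^d$ is bounded, continuous, $\int$s.p.d., and has $H_k \subset C_0$, so its MMD metrizes weak convergence under the paper's characterization.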
Cite
Text
Simon-Gabriel et al. "Metrizing Weak Convergence with Maximum Mean Discrepancies." Journal of Machine Learning Research, 2023.
BibTeX
@article{simongabriel2023jmlr-metrizing,
title = {{Metrizing Weak Convergence with Maximum Mean Discrepancies}},
author = {Simon-Gabriel, Carl-Johann and Barp, Alessandro and Schölkopf, Bernhard and Mackey, Lester},
journal = {Journal of Machine Learning Research},
year = {2023},
pages = {1-20},
volume = {24},
url = {https://mlanthology.org/jmlr/2023/simongabriel2023jmlr-metrizing/}
}