Asymptotics of Smoothed Wasserstein Distances in the Small Noise Regime
Abstract
We study the behavior of the Wasserstein-$2$ distance between discrete measures $\mu$ and $\nu$ in $\mathbb{R}^d$ when both measures are smoothed by small amounts of Gaussian noise. This procedure, known as Gaussian-smoothed optimal transport, has recently attracted attention as a statistically appealing alternative to the unregularized Wasserstein distance. We give precise bounds on the approximation properties of this proposal in the small noise regime, and establish the existence of a phase transition: we show that, if the optimal transport plan from $\mu$ to $\nu$ is unique and a perfect matching, there exists a critical threshold such that the difference between $W_2(\mu, \nu)$ and the Gaussian-smoothed OT distance $W_2(\mu \ast \mathcal{N}_\sigma, \nu \ast \mathcal{N}_\sigma)$ scales like $\exp(-c/\sigma^2)$ for $\sigma$ below the threshold, and like $\sigma$ above it. These results establish that, for $\sigma$ sufficiently small, the smoothed Wasserstein distance approximates the unregularized distance exponentially well.
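To make the quantities in the abstract concrete, below is a minimal numerical sketch (not from the paper) of the gap between $W_2(\mu, \nu)$ and its Gaussian-smoothed counterpart for two small discrete measures. It assumes the Python Optimal Transport library (`ot`, installable as `pot`); the example point clouds, sample sizes, and the Monte Carlo estimator of the smoothed distance are illustrative choices, not the authors' construction.

```python
# Illustrative sketch (not from the paper): compare W_2(mu, nu) with a Monte Carlo
# estimate of the Gaussian-smoothed distance W_2(mu * N_sigma, nu * N_sigma).
# Requires the POT library: pip install pot
import numpy as np
import ot  # Python Optimal Transport

rng = np.random.default_rng(0)

# Two discrete measures in R^2 whose optimal plan is a unique perfect matching:
# each atom of mu is transported to exactly one nearby atom of nu.
x = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])   # support of mu
y = np.array([[0.1, 0.0], [1.0, 0.2], [-0.1, 1.1]])  # support of nu
a = np.full(len(x), 1.0 / len(x))                    # uniform weights of mu
b = np.full(len(y), 1.0 / len(y))                    # uniform weights of nu

def w2(xs, xt, ws, wt):
    """Exact Wasserstein-2 distance between two weighted point clouds."""
    M = ot.dist(xs, xt, metric="sqeuclidean")        # cost matrix |x_i - y_j|^2
    return np.sqrt(ot.emd2(ws, wt, M))

def smoothed_w2(sigma, n=1000):
    """Monte Carlo estimate of W_2(mu * N_sigma, nu * N_sigma): draw n points from
    each smoothed measure (random atom + Gaussian noise) and compute the empirical W_2."""
    xs = x[rng.choice(len(x), size=n, p=a)] + sigma * rng.standard_normal((n, 2))
    yt = y[rng.choice(len(y), size=n, p=b)] + sigma * rng.standard_normal((n, 2))
    u = np.full(n, 1.0 / n)
    return w2(xs, yt, u, u)

w2_unsmoothed = w2(x, y, a, b)
print(f"W_2(mu, nu) = {w2_unsmoothed:.4f}")
for sigma in [0.02, 0.05, 0.1, 0.2, 0.4]:
    gap = w2_unsmoothed - smoothed_w2(sigma)
    print(f"sigma = {sigma:4.2f}   W_2(mu, nu) - smoothed estimate = {gap:+.4f}")
```

Note that the Monte Carlo error of the empirical estimate can easily dominate the exponentially small gap at small $\sigma$, so this sketch only illustrates the quantity being compared, not the sharp rates established in the paper.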
Cite
Text
Ding and Niles-Weed. "Asymptotics of Smoothed Wasserstein Distances in the Small Noise Regime." Neural Information Processing Systems, 2022.
Markdown
[Ding and Niles-Weed. "Asymptotics of Smoothed Wasserstein Distances in the Small Noise Regime." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/ding2022neurips-asymptotics/)
BibTeX
@inproceedings{ding2022neurips-asymptotics,
title = {{Asymptotics of Smoothed Wasserstein Distances in the Small Noise Regime}},
author = {Ding, Yunzi and Niles-Weed, Jonathan},
booktitle = {Neural Information Processing Systems},
year = {2022},
url = {https://mlanthology.org/neurips/2022/ding2022neurips-asymptotics/}
}