Gaussian-Smoothed Optimal Transport: Metric Structure and Statistical Efficiency
Abstract
Optimal transport (OT), and in particular the Wasserstein distance, has seen a surge of interest and applications in machine learning. However, empirical approximation of Wasserstein distances suffers from a severe curse of dimensionality, rendering them impractical in high dimensions. As a result, entropically regularized OT has become a popular workaround. Yet, while it enjoys fast algorithms and better statistical properties, it loses the metric structure that Wasserstein distances enjoy. This work proposes a novel Gaussian-smoothed OT (GOT) framework that achieves the best of both worlds: preserving the 1-Wasserstein metric structure while alleviating the empirical approximation curse of dimensionality. Furthermore, as the Gaussian-smoothing parameter shrinks to zero, GOT $\Gamma$-converges towards classic OT (with convergence of optimizers), thus serving as a natural extension. An empirical study that validates the theoretical results is provided, promoting Gaussian-smoothed OT as a powerful alternative to entropic OT.
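As a rough illustration of the construction, the smoothed distance compares the measures after convolution with an isotropic Gaussian, i.e. $W_1(\mu * \mathcal{N}_\sigma, \nu * \mathcal{N}_\sigma)$ with $\mathcal{N}_\sigma = \mathcal{N}(0, \sigma^2 I_d)$. The sketch below is a minimal, hedged example (not the authors' code): it approximates the convolution by adding independent Gaussian noise to the sample points and solves the resulting discrete OT problem with the POT library (ot.dist, ot.emd2). The helper name smoothed_w1, the replication parameter n_mc, and the sample sizes are illustrative assumptions.

import numpy as np
import ot  # POT: Python Optimal Transport (assumed installed)

def smoothed_w1(x, y, sigma, n_mc=1, seed=None):
    """Monte Carlo approximation of the Gaussian-smoothed 1-Wasserstein
    distance between the empirical measures of x (n, d) and y (m, d).

    Convolution with N(0, sigma^2 I_d) is approximated by replicating each
    sample n_mc times and adding independent Gaussian noise; the resulting
    discrete OT problem is then solved exactly by POT's network simplex.
    """
    rng = np.random.default_rng(seed)
    xs = np.repeat(x, n_mc, axis=0)
    ys = np.repeat(y, n_mc, axis=0)
    xs = xs + sigma * rng.standard_normal(xs.shape)
    ys = ys + sigma * rng.standard_normal(ys.shape)
    a = np.full(xs.shape[0], 1.0 / xs.shape[0])   # uniform weights
    b = np.full(ys.shape[0], 1.0 / ys.shape[0])
    M = ot.dist(xs, ys, metric='euclidean')       # Euclidean cost -> W1
    return ot.emd2(a, b, M)                       # exact OT cost

# Illustrative usage: two Gaussians in d = 5, shifted by 0.5 in each coordinate.
rng = np.random.default_rng(0)
x = rng.standard_normal((200, 5))
y = rng.standard_normal((200, 5)) + 0.5
print(smoothed_w1(x, y, sigma=1.0, n_mc=2, seed=1))

Taking sigma to zero recovers the plain empirical 1-Wasserstein distance between the samples, consistent with the $\Gamma$-convergence statement above.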
Cite
Text
Goldfeld and Greenewald. "Gaussian-Smoothed Optimal Transport: Metric Structure and Statistical Efficiency." Artificial Intelligence and Statistics, 2020.
Markdown
[Goldfeld and Greenewald. "Gaussian-Smoothed Optimal Transport: Metric Structure and Statistical Efficiency." Artificial Intelligence and Statistics, 2020.](https://mlanthology.org/aistats/2020/goldfeld2020aistats-gaussiansmoothed/)
BibTeX
@inproceedings{goldfeld2020aistats-gaussiansmoothed,
title = {{Gaussian-Smoothed Optimal Transport: Metric Structure and Statistical Efficiency}},
author = {Goldfeld, Ziv and Greenewald, Kristjan},
booktitle = {Artificial Intelligence and Statistics},
year = {2020},
pages = {3327--3337},
volume = {108},
url = {https://mlanthology.org/aistats/2020/goldfeld2020aistats-gaussiansmoothed/}
}