Sample Complexity Bounds for Estimating the Wasserstein Distance Under Invariances
Abstract
Group-invariant probability distributions appear in many data-generative models in machine learning, such as those for graphs, point clouds, and images. In practice, one often needs to estimate divergences between such distributions. In this work, we study how the inherent invariances with respect to any smooth action of a Lie group on a manifold improve the sample complexity of estimating the Wasserstein distance. Our results indicate a two-fold gain: (1) a reduction in sample complexity by a multiplicative factor corresponding to the group size (for finite groups) or the normalized volume of the quotient space (for groups of positive dimension); (2) an improvement in the exponent of the convergence rate (for groups of positive dimension). These results are completely new for groups of positive dimension and tighten recent bounds for finite group actions.
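To make the two claimed gains concrete, the following schematic is an illustrative reading of the abstract, not a formula taken from the paper. It assumes the classical empirical-measure bound $\mathbb{E}\,W_1(\mu,\hat{\mu}_n) \lesssim n^{-1/d}$ for a distribution $\mu$ on a $d$-dimensional manifold $M$ (with $d \ge 3$), and uses the hypothetical notation $\hat{\mu}_n^{G}$ for a $G$-symmetrized empirical measure; the exact constants and exponents in the paper may differ.

% Illustrative sketch only. Assumptions: d >= 3, the Lie group G acts
% smoothly on the d-dimensional manifold M, and \hat{\mu}_n^{G} is a
% hypothetical G-symmetrized empirical measure from n i.i.d. samples.
\[
  \underbrace{\mathbb{E}\, W_1(\mu, \hat{\mu}_n) \;\lesssim\; n^{-1/d}}_{\text{no invariance}}
  \qquad
  \underbrace{\mathbb{E}\, W_1(\mu, \hat{\mu}_n^{G}) \;\lesssim\; (|G|\, n)^{-1/d}}_{\text{finite } G:\ \text{multiplicative gain}}
  \qquad
  \underbrace{\mathbb{E}\, W_1(\mu, \hat{\mu}_n^{G}) \;\lesssim\; n^{-1/\dim(M/G)}}_{\dim G > 0:\ \text{improved exponent}}
\]

In this reading, a finite group leaves the exponent $1/d$ unchanged and contributes the multiplicative factor $|G|$ inside the rate, whereas a group of positive dimension lowers the effective dimension to that of the quotient space $M/G$ (for a free action, $\dim(M/G) = d - \dim G$), which is what improves the exponent.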
Cite
Text
Tahmasebi and Jegelka. "Sample Complexity Bounds for Estimating the Wasserstein Distance Under Invariances." ICML 2023 Workshops: TAGML, 2023.
Markdown
[Tahmasebi and Jegelka. "Sample Complexity Bounds for Estimating the Wasserstein Distance Under Invariances." ICML 2023 Workshops: TAGML, 2023.](https://mlanthology.org/icmlw/2023/tahmasebi2023icmlw-sample/)
BibTeX
@inproceedings{tahmasebi2023icmlw-sample,
  title = {{Sample Complexity Bounds for Estimating the Wasserstein Distance Under Invariances}},
  author = {Tahmasebi, Behrooz and Jegelka, Stefanie},
  booktitle = {ICML 2023 Workshops: TAGML},
  year = {2023},
  url = {https://mlanthology.org/icmlw/2023/tahmasebi2023icmlw-sample/}
}