On the Convergence of Projected Bures-Wasserstein Gradient Descent Under Euclidean Strong Convexity

Abstract

The Bures-Wasserstein (BW) gradient descent method has gained considerable attention in various domains, including Gaussian barycenter, matrix recovery, and variational inference problems, due to its alignment with the Wasserstein geometry of normal distributions. Despite its popularity, existing convergence analyses are often contingent upon specific loss functions, and the exploration of constrained settings within this framework remains limited. In this work, we attempt to bridge this gap by providing a general convergence rate guarantee for BW gradient descent under the assumption of Euclidean strong convexity of the loss and the constraints. To advance practical implementations, we also derive a closed-form solution for the projection onto BW distance-constrained sets, which enables fast implementation of projected BW gradient descent for problems arising in the constrained barycenter and distributionally robust optimization literature. Experimental results demonstrate significant improvements in computational efficiency and convergence speed, underscoring the efficacy of our method in practical scenarios.
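To make the setting concrete, below is a minimal, illustrative sketch of BW gradient descent for the unconstrained Gaussian barycenter problem mentioned in the abstract, using the standard fixed-point form of the unit-step BW gradient update for centered Gaussians. It is not the paper's algorithm and does not include the closed-form projection derived in the paper; the function names, step count, and tolerance are our own choices for illustration.

```python
import numpy as np
from scipy.linalg import sqrtm


def bw_barycenter(covs, n_iter=100, tol=1e-10):
    """Illustrative BW gradient descent (unit step) for the barycenter of
    centered Gaussians with covariances `covs` (list of SPD matrices).

    Uses the fixed-point form of the update:
        S_{t+1} = S_t^{-1/2} ( mean_i (S_t^{1/2} C_i S_t^{1/2})^{1/2} )^2 S_t^{-1/2}
    """
    S = np.mean(covs, axis=0)  # simple SPD initialization
    for _ in range(n_iter):
        S_half = np.real(sqrtm(S))
        S_half_inv = np.linalg.inv(S_half)
        # Average of the "square-root" terms appearing in the OT maps.
        M = np.mean(
            [np.real(sqrtm(S_half @ C @ S_half)) for C in covs], axis=0
        )
        S_next = S_half_inv @ M @ M @ S_half_inv
        S_next = (S_next + S_next.T) / 2  # re-symmetrize numerical residue
        if np.linalg.norm(S_next - S) < tol:
            return S_next
        S = S_next
    return S


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    covs = []
    for _ in range(5):
        A = rng.standard_normal((3, 3))
        covs.append(A @ A.T + np.eye(3))  # random SPD matrices
    print(bw_barycenter(covs))
```

In a constrained variant of this problem, each such update would be followed by a projection back onto the feasible set (e.g., a BW distance ball), which is where the paper's closed-form projection becomes relevant.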

Cite

Text

Fan et al. "On the Convergence of Projected Bures-Wasserstein Gradient Descent Under Euclidean Strong Convexity." International Conference on Machine Learning, 2024.

Markdown

[Fan et al. "On the Convergence of Projected Bures-Wasserstein Gradient Descent Under Euclidean Strong Convexity." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/fan2024icml-convergence/)

BibTeX

@inproceedings{fan2024icml-convergence,
  title     = {{On the Convergence of Projected Bures-Wasserstein Gradient Descent Under Euclidean Strong Convexity}},
  author    = {Fan, Junyi and Han, Yuxuan and Liu, Zijian and Cai, Jian-Feng and Wang, Yang and Zhou, Zhengyuan},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {12832--12857},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/fan2024icml-convergence/}
}