Low-Rank Sinkhorn Factorization

Abstract

Several recent applications of optimal transport (OT) theory to machine learning have relied on regularization, notably entropy and the Sinkhorn algorithm. Because matrix-vector products are pervasive in the Sinkhorn algorithm, several works have proposed to *approximate* the kernel matrices appearing in its iterations using low-rank factors. Another route lies instead in imposing low-nonnegative-rank constraints on the feasible set of couplings considered in OT problems, with no approximation of cost or kernel matrices. This route was first explored by Forrow et al. (2018), who proposed an algorithm tailored to the squared Euclidean ground cost, using a proxy objective that can be solved through the machinery of regularized 2-Wasserstein barycenters. Building on this, we introduce in this work a generic approach that solves, in full generality, the OT problem under low-nonnegative-rank constraints with arbitrary costs. Our algorithm relies on an explicit factorization of low-rank couplings as a product of *sub-coupling* factors linked by a common marginal; similar to an NMF approach, we alternately update these factors. We prove the non-asymptotic stationary convergence of this algorithm and illustrate its efficiency on benchmark experiments.
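The factorization mentioned in the abstract can be illustrated numerically. The following sketch (an assumption-laden illustration, not the authors' code) builds a coupling of nonnegative rank at most `r` as a product of two sub-couplings `Q` and `R` that share a common inner marginal `g`, i.e. `P = Q diag(1/g) R.T`, and checks that the outer marginals of `P` are recovered from the row sums of the factors:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, r = 5, 4, 2

# Common inner marginal g shared by both sub-coupling factors.
g = np.full(r, 1.0 / r)

# Q plays the role of a sub-coupling with column sums g; its row sums
# then define the first outer marginal a (names a, b, g are illustrative).
Q = rng.random((n, r))
Q = Q / Q.sum(axis=0) * g
a = Q.sum(axis=1)

# R is the second sub-coupling, also with column sums g.
R = rng.random((m, r))
R = R / R.sum(axis=0) * g
b = R.sum(axis=1)

# Low-nonnegative-rank coupling: P = Q diag(1/g) R^T.
P = Q @ np.diag(1.0 / g) @ R.T

# The common marginal cancels, so P lies in the transport polytope (a, b).
assert np.allclose(P.sum(axis=1), a)   # row marginal equals a
assert np.allclose(P.sum(axis=0), b)   # column marginal equals b
assert np.linalg.matrix_rank(P) <= r   # rank bounded by r
```

The key point is that because both factors share the inner marginal `g`, the product automatically satisfies the marginal constraints, so alternating updates of `Q`, `R`, and `g` (as in an NMF-style scheme) stay inside the feasible set of rank-constrained couplings.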

Cite

Text

Scetbon et al. "Low-Rank Sinkhorn Factorization." International Conference on Machine Learning, 2021.

Markdown

[Scetbon et al. "Low-Rank Sinkhorn Factorization." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/scetbon2021icml-lowrank/)

BibTeX

@inproceedings{scetbon2021icml-lowrank,
  title     = {{Low-Rank Sinkhorn Factorization}},
  author    = {Scetbon, Meyer and Cuturi, Marco and Peyré, Gabriel},
  booktitle = {International Conference on Machine Learning},
  year      = {2021},
  pages     = {9344--9354},
  volume    = {139},
  url       = {https://mlanthology.org/icml/2021/scetbon2021icml-lowrank/}
}