Wasserstein Coreset via Sinkhorn Loss
Abstract
Coreset selection, the task of compressing a large dataset into a small representative subset while preserving downstream performance, is crucial for modern machine learning. This paper presents a novel method for generating high-quality Wasserstein coresets using the Sinkhorn loss, an entropy-regularized optimal transport objective with significant computational advantages. However, existing approaches suffer from numerical instability in Sinkhorn's algorithm. We address this by proposing stable algorithms for computing and differentiating the Sinkhorn optimization problem, including an analytical formula for the derivative of the Sinkhorn loss and a rigorous stability analysis of our method. Extensive experiments demonstrate that our approach significantly outperforms existing methods in sample selection quality and computational efficiency, achieving a smaller Wasserstein distance to the full dataset.
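To make the abstract's objective concrete, the following is a minimal sketch of the Sinkhorn loss between two weighted point clouds, computed with log-domain iterations (a standard stabilization that avoids the overflow/underflow the abstract refers to). This is an illustrative implementation under assumed conventions (uniform weights, squared Euclidean cost), not the authors' algorithm; the function name `sinkhorn_loss` and all parameters are hypothetical.

```python
import numpy as np
from scipy.special import logsumexp

def sinkhorn_loss(x, y, eps=0.1, n_iters=200):
    """Entropy-regularized OT cost <P, C> between uniform empirical
    measures on point clouds x (n, d) and y (m, d), using log-domain
    (numerically stabilized) Sinkhorn iterations."""
    n, m = len(x), len(y)
    # Squared Euclidean cost matrix C[i, j] = ||x_i - y_j||^2
    C = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    log_a = np.full(n, -np.log(n))  # log of uniform source weights
    log_b = np.full(m, -np.log(m))  # log of uniform target weights
    f = np.zeros(n)  # dual potentials
    g = np.zeros(m)
    for _ in range(n_iters):
        # Log-domain updates: logsumexp replaces the unstable
        # multiplicative scaling of the Gibbs kernel exp(-C / eps)
        f = -eps * logsumexp((g[None, :] - C) / eps + log_b[None, :], axis=1)
        g = -eps * logsumexp((f[:, None] - C) / eps + log_a[:, None], axis=0)
    # Optimal plan in log space: P = diag(a) K diag(b) scaled by potentials
    log_P = (f[:, None] + g[None, :] - C) / eps + log_a[:, None] + log_b[None, :]
    return np.sum(np.exp(log_P) * C)
```

A coreset method built on this loss would treat the support points of the small measure as optimization variables; the dual potentials `f` and `g` are also the quantities through which an analytical derivative of the loss can be expressed.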
Cite
Yin et al. "Wasserstein Coreset via Sinkhorn Loss." Transactions on Machine Learning Research, 2025.

BibTeX
@article{yin2025tmlr-wasserstein,
title = {{Wasserstein Coreset via Sinkhorn Loss}},
author = {Yin, Haoyun and Qiu, Yixuan and Wang, Xiao},
journal = {Transactions on Machine Learning Research},
year = {2025},
url = {https://mlanthology.org/tmlr/2025/yin2025tmlr-wasserstein/}
}