Robust Computation of Optimal Transport by $\beta$-Potential Regularization
Abstract
Optimal transport (OT) has become a widely used tool in machine learning to measure the discrepancy between probability distributions. For instance, OT is a popular loss function that quantifies the discrepancy between an empirical distribution and a parametric model. Recently, an entropic penalty term and the celebrated Sinkhorn algorithm have been commonly used to approximate the original OT in a computationally efficient way. However, since the Sinkhorn algorithm performs projections associated with the Kullback-Leibler (KL) divergence, it is often vulnerable to outliers. To overcome this problem, we propose regularizing OT with the $\beta$-potential term associated with the so-called $\beta$-divergence, which was developed in robust statistics. Our theoretical analysis reveals that the $\beta$-potential can prevent mass from being transported to outliers. We experimentally demonstrate that the transport matrix computed with our algorithm helps estimate a probability distribution robustly even in the presence of outliers. In addition, our proposed method can successfully detect outliers in a contaminated dataset.
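For context, the entropic approach the abstract contrasts against solves a KL-penalized OT problem with Sinkhorn's alternating scaling updates. Below is a minimal NumPy sketch of that standard entropic baseline, not the paper's $\beta$-potential solver; the function name and the marginals `a`, `b`, cost matrix `C`, and regularization strength `eps` are illustrative assumptions.

```python
import numpy as np

def sinkhorn(a, b, C, eps=1.0, n_iter=1000):
    """Entropic OT baseline: min_P <P, C> + eps * KL(P || a b^T),
    solved by Sinkhorn's alternating scaling updates (illustrative sketch)."""
    K = np.exp(-C / eps)                # Gibbs kernel built from the cost matrix
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)               # scale columns to match marginal b
        u = a / (K @ v)                 # scale rows to match marginal a
    return u[:, None] * K * v[None, :]  # transport plan P = diag(u) K diag(v)

# Toy usage: 1-D point clouds where one source point acts as an outlier.
x = np.array([0.0, 1.0, 2.0, 10.0])     # 10.0 plays the role of an outlier
y = np.array([0.1, 0.9, 2.1, 2.2])
C = (x[:, None] - y[None, :]) ** 2      # squared-distance cost
a = b = np.full(4, 0.25)                # uniform marginals
P = sinkhorn(a, b, C)
print(P.round(3))
```

Because the KL projection enforces the marginals exactly, the plan above must transport the outlier's full mass of 0.25; the $\beta$-potential regularization studied in the paper is designed to damp exactly this effect. (For small `eps`, a log-domain implementation is needed to avoid numerical underflow.)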
Cite
Text
Nakamura et al. "Robust Computation of Optimal Transport by $\beta$-Potential Regularization." Proceedings of The 14th Asian Conference on Machine Learning, 2022.
Markdown
[Nakamura et al. "Robust Computation of Optimal Transport by $\beta$-Potential Regularization." Proceedings of The 14th Asian Conference on Machine Learning, 2022.](https://mlanthology.org/acml/2022/nakamura2022acml-robust/)
BibTeX
@inproceedings{nakamura2022acml-robust,
title = {{Robust Computation of Optimal Transport by $\beta$-Potential Regularization}},
author = {Nakamura, Shintaro and Bao, Han and Sugiyama, Masashi},
booktitle = {Proceedings of The 14th Asian Conference on Machine Learning},
year = {2022},
pages = {770--785},
volume = {189},
url = {https://mlanthology.org/acml/2022/nakamura2022acml-robust/}
}