Wasserstein Convergence Guarantees for a General Class of Score-Based Generative Models
Abstract
Score-based generative models are a recent class of deep generative models with state-of-the-art performance in many applications. In this paper, we establish convergence guarantees for a general class of score-based generative models in the 2-Wasserstein distance, assuming accurate score estimates and a smooth, log-concave data distribution. We specialize our results to several concrete score-based generative models with specific choices of forward processes modeled by stochastic differential equations, and obtain an upper bound on the iteration complexity for each model, which demonstrates the impact of different choices of the forward process. We also provide a lower bound when the data distribution is Gaussian. Numerically, we experiment with score-based generative models with different forward processes for unconditional image generation on CIFAR-10. We find that the experimental results are in good agreement with our theoretical predictions on the iteration complexity.
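To make the setting concrete, below is a minimal sketch of the sampling pipeline the abstract describes: an Ornstein–Uhlenbeck (VP-type) forward SDE discretized by Euler–Maruyama in reverse time. It assumes one-dimensional Gaussian data, matching the paper's lower-bound setting, so the score is available in closed form rather than estimated; the function names (`ou_marginal`, `reverse_sampler`) and parameter choices are illustrative and not taken from the paper.

```python
import numpy as np


def ou_marginal(t, mu0, var0):
    """Mean and variance of X_t under dX = -X dt + sqrt(2) dW, X_0 ~ N(mu0, var0)."""
    m = mu0 * np.exp(-t)
    v = var0 * np.exp(-2.0 * t) + 1.0 - np.exp(-2.0 * t)
    return m, v


def score(x, t, mu0, var0):
    """Exact score grad log p_t(x) for the Gaussian marginal N(m_t, v_t)."""
    m, v = ou_marginal(t, mu0, var0)
    return -(x - m) / v


def reverse_sampler(n_samples, T=5.0, n_steps=500, mu0=2.0, var0=0.5, seed=0):
    """Euler-Maruyama discretization of the reverse-time SDE
    dY_s = [Y_s + 2 * score(Y_s, T - s)] ds + sqrt(2) dB_s,
    initialized from N(0, 1), which approximates p_T for large T."""
    rng = np.random.default_rng(seed)
    h = T / n_steps
    y = rng.standard_normal(n_samples)
    for k in range(n_steps):
        t = T - k * h  # forward time corresponding to reverse step k
        drift = y + 2.0 * score(y, t, mu0, var0)
        y = y + h * drift + np.sqrt(2.0 * h) * rng.standard_normal(n_samples)
    return y


samples = reverse_sampler(100_000)
print(samples.mean(), samples.var())  # should approximate mu0 = 2.0 and var0 = 0.5
```

In this toy setting the 2-Wasserstein error can be checked directly, since both the target and the empirical output are (approximately) one-dimensional Gaussians; with an estimated score and general log-concave data, the paper's bounds quantify how the error scales with the step count and the choice of forward process.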
Cite

Text
Gao et al. "Wasserstein Convergence Guarantees for a General Class of Score-Based Generative Models." Journal of Machine Learning Research, 2025.

Markdown
[Gao et al. "Wasserstein Convergence Guarantees for a General Class of Score-Based Generative Models." Journal of Machine Learning Research, 2025.](https://mlanthology.org/jmlr/2025/gao2025jmlr-wasserstein/)

BibTeX
@article{gao2025jmlr-wasserstein,
  title   = {{Wasserstein Convergence Guarantees for a General Class of Score-Based Generative Models}},
  author  = {Gao, Xuefeng and Nguyen, Hoang M. and Zhu, Lingjiong},
  journal = {Journal of Machine Learning Research},
  year    = {2025},
  volume  = {26},
  pages   = {1--54},
  url     = {https://mlanthology.org/jmlr/2025/gao2025jmlr-wasserstein/}
}