Convergence of Score-Based Generative Modeling for General Data Distributions

Abstract

We give polynomial convergence guarantees for denoising diffusion models that do not rely on the data distribution satisfying functional inequalities or strong smoothness assumptions. Assuming an $L^2$-accurate score estimate, we obtain Wasserstein distance guarantees for any distribution with bounded support or sufficiently decaying tails, as well as TV guarantees for distributions satisfying further smoothness assumptions.
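As a hedged sketch of the key assumption (the notation below is illustrative, not taken verbatim from the paper): an $L^2$-accurate score estimate is a learned function $s_\theta$ whose mean-squared error against the true score $\nabla_x \log q_t$ of the noised data distribution $q_t$ is bounded at each time $t$,

$$\mathbb{E}_{x \sim q_t}\big[\|\, s_\theta(x, t) - \nabla_x \log q_t(x) \,\|^2\big] \le \epsilon^2,$$

and the guarantees then bound the Wasserstein (or TV) distance between the data distribution and the law of the sampler's output, with polynomial dependence on $\epsilon$ and the problem parameters.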

Cite

Text

Lee et al. "Convergence of Score-Based Generative Modeling for General Data Distributions." NeurIPS 2022 Workshops: SBM, 2022.

Markdown

[Lee et al. "Convergence of Score-Based Generative Modeling for General Data Distributions." NeurIPS 2022 Workshops: SBM, 2022.](https://mlanthology.org/neuripsw/2022/lee2022neuripsw-convergence/)

BibTeX

@inproceedings{lee2022neuripsw-convergence,
  title     = {{Convergence of Score-Based Generative Modeling for General Data Distributions}},
  author    = {Lee, Holden and Lu, Jianfeng and Tan, Yixin},
  booktitle = {NeurIPS 2022 Workshops: SBM},
  year      = {2022},
  url       = {https://mlanthology.org/neuripsw/2022/lee2022neuripsw-convergence/}
}