Accelerating Convergence of Score-Based Diffusion Models, Provably

Abstract

Score-based diffusion models, while achieving remarkable empirical performance, often suffer from slow sampling, due to the large number of function evaluations required during the sampling phase. Despite a flurry of recent activity aimed at speeding up diffusion generative modeling in practice, theoretical underpinnings for acceleration techniques remain severely limited. In this paper, we design novel training-free algorithms to accelerate popular deterministic (i.e., DDIM) and stochastic (i.e., DDPM) samplers. Our accelerated deterministic sampler converges at a rate $O(\frac{1}{T^2})$ with $T$ the number of steps, improving upon the $O(\frac{1}{T})$ rate of the DDIM sampler; and our accelerated stochastic sampler converges at a rate $O(\frac{1}{T})$, outperforming the $O(\frac{1}{\sqrt{T}})$ rate of the DDPM sampler. The design of our algorithms leverages insights from higher-order approximation, and shares similar intuitions with popular high-order ODE solvers such as DPM-Solver-2. Our theory accommodates $\ell_2$-accurate score estimates, and does not require log-concavity or smoothness of the target distribution.
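To give a flavor of the higher-order idea referenced above (a generic illustration under the standard probability-flow-ODE formulation, not the authors' exact update rule), deterministic samplers discretize the reverse-time ODE
$$\mathrm{d}x_t \;=\; \Big(f(t)\,x_t \;-\; \tfrac{1}{2}\,g(t)^2\,\nabla_x \log p_t(x_t)\Big)\,\mathrm{d}t ,$$
and a second-order scheme in the spirit of DPM-Solver-2 evaluates the (learned) score $s_\theta$ at an intermediate point, e.g. an explicit midpoint step of size $h$:
$$v(t,x) \;:=\; f(t)\,x \;-\; \tfrac{1}{2}\,g(t)^2\, s_\theta(t,x), \qquad x_{t-h} \;\approx\; x_t \;-\; h\, v\!\Big(t-\tfrac{h}{2},\; x_t - \tfrac{h}{2}\, v(t, x_t)\Big).$$
The standard numerical-analysis heuristic is that such an extra evaluation shrinks the per-step discretization error from $O(h^2)$ to $O(h^3)$, which is consistent with the improvement from an $O(1/T)$ to an $O(1/T^2)$ rate, though the paper's guarantees are established for its own training-free update rules rather than this generic midpoint sketch.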

Cite

Text

Li et al. "Accelerating Convergence of Score-Based Diffusion Models, Provably." International Conference on Machine Learning, 2024.

Markdown

[Li et al. "Accelerating Convergence of Score-Based Diffusion Models, Provably." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/li2024icml-accelerating/)

BibTeX

@inproceedings{li2024icml-accelerating,
  title     = {{Accelerating Convergence of Score-Based Diffusion Models, Provably}},
  author    = {Li, Gen and Huang, Yu and Efimov, Timofey and Wei, Yuting and Chi, Yuejie and Chen, Yuxin},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {27942--27954},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/li2024icml-accelerating/}
}