The Probability Flow ODE Is Provably Fast
Abstract
We provide the first polynomial-time convergence guarantees for the probability flow ODE implementation (together with a corrector step) of score-based generative modeling. Our analysis builds on recent results obtaining such guarantees for the SDE-based implementation (i.e., denoising diffusion probabilistic modeling, or DDPM), but requires the development of novel techniques for studying deterministic dynamics without contractivity. Through the use of a specially chosen corrector step based on the underdamped Langevin diffusion, we obtain better dimension dependence than prior works on DDPM ($O(\sqrt d)$ vs. $O(d)$, assuming smoothness of the data distribution), highlighting potential advantages of the ODE framework.
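The sampler analyzed in the abstract has a predictor-corrector structure: discretized steps of the probability flow ODE, interleaved with a few steps of the underdamped Langevin diffusion. The following is a minimal Python sketch of that structure only, not the paper's algorithm: the score function is a hypothetical stand-in for a learned score network (exact here for toy Gaussian data under an Ornstein-Uhlenbeck forward process), and the plain Euler and Euler-Maruyama discretizations are cruder than the schemes the paper's guarantees rely on.

import numpy as np

def score(x, t):
    # Hypothetical stand-in for a learned score network s_theta(x, t).
    # With data distributed as N(0, I), the OU forward process keeps every
    # marginal equal to N(0, I), so the exact score is -x.
    return -x

def predictor_step(x, t, h):
    # One Euler step of the reverse-time probability flow ODE
    #     dY_s = (Y_s + score(Y_s, T - s)) ds,
    # the deterministic counterpart of the OU forward process
    #     dX_t = -X_t dt + sqrt(2) dB_t.
    return x + h * (x + score(x, t))

def corrector_steps(x, t, rng, n_steps=5, eta=0.01, gamma=2.0):
    # Corrector: a few Euler-Maruyama steps of the underdamped Langevin
    # diffusion targeting the marginal at forward time t:
    #     dX = V ds,
    #     dV = (score(X, t) - gamma * V) ds + sqrt(2 * gamma) dB.
    v = rng.standard_normal(x.shape)
    for _ in range(n_steps):
        x = x + eta * v
        v = (v + eta * (score(x, t) - gamma * v)
             + np.sqrt(2.0 * gamma * eta) * rng.standard_normal(x.shape))
    return x

def sample(d=2, T=3.0, n_steps=100, seed=0):
    # Initialize from the OU stationary law N(0, I), then alternate
    # probability flow ODE steps with underdamped Langevin correction.
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(d)
    h = T / n_steps
    for k in range(n_steps):
        t = T - k * h                    # current forward time
        x = predictor_step(x, t, h)      # move one step toward time t - h
        x = corrector_steps(x, t - h, rng)
    return x

print(sample())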
Cite
Text
Chen et al. "The Probability Flow ODE Is Provably Fast." Neural Information Processing Systems, 2023.

Markdown
[Chen et al. "The Probability Flow ODE Is Provably Fast." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/chen2023neurips-probability/)

BibTeX
@inproceedings{chen2023neurips-probability,
  title = {{The Probability Flow ODE Is Provably Fast}},
  author = {Chen, Sitan and Chewi, Sinho and Lee, Holden and Li, Yuanzhi and Lu, Jianfeng and Salim, Adil},
  booktitle = {Neural Information Processing Systems},
  year = {2023},
  url = {https://mlanthology.org/neurips/2023/chen2023neurips-probability/}
}