Restoration-Degradation Beyond Linear Diffusions: A Non-Asymptotic Analysis for DDIM-Type Samplers

Abstract

We develop a framework for non-asymptotic analysis of deterministic samplers used for diffusion generative modeling. Several recent works have analyzed stochastic samplers using tools like Girsanov’s theorem and a chain rule variant of the interpolation argument. Unfortunately, these techniques give vacuous bounds when applied to deterministic samplers. We give a new operational interpretation for deterministic sampling by showing that one step along the probability flow ODE can be expressed as two steps: 1) a restoration step that runs gradient ascent on the conditional log-likelihood at some infinitesimally previous time, and 2) a degradation step that runs the forward process using noise pointing back towards the current iterate. This perspective allows us to extend denoising diffusion implicit models to general, non-linear forward processes. We then develop the first polynomial convergence bounds for these samplers under mild conditions on the data distribution.
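For intuition, the sketch below writes the familiar linear (variance-preserving) DDIM update in the restoration-degradation form described above. This is a minimal illustrative sketch, not code from the paper: the names `ddim_step`, `eps_theta`, `alpha_bar_t`, and `alpha_bar_prev` are assumptions, and the paper's general non-linear case replaces these closed-form maps with an infinitesimal gradient-ascent (restoration) step followed by the forward process.

```python
import math
import torch

def ddim_step(x_t: torch.Tensor, eps_theta: torch.Tensor,
              alpha_bar_t: float, alpha_bar_prev: float) -> torch.Tensor:
    """One deterministic DDIM update, annotated with the
    restoration-degradation reading (linear / variance-preserving case).
    All names here are illustrative, not taken from the paper."""
    # Restoration: undo the forward noising to estimate the clean sample,
    # i.e. move along the learned noise-prediction (score) direction.
    x0_hat = (x_t - math.sqrt(1.0 - alpha_bar_t) * eps_theta) / math.sqrt(alpha_bar_t)
    # Degradation: re-run the forward process to the earlier time, but reuse
    # the noise direction eps_theta pointing back toward the current iterate
    # x_t instead of drawing fresh Gaussian noise; this reuse is what makes
    # the sampler deterministic.
    return math.sqrt(alpha_bar_prev) * x0_hat + math.sqrt(1.0 - alpha_bar_prev) * eps_theta
```

In this reading, a stochastic (DDPM-type) sampler would draw fresh Gaussian noise in the degradation step, whereas reusing `eps_theta` recovers the deterministic DDIM trajectory that the probability flow ODE discretizes.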

Cite

Text

Chen et al. "Restoration-Degradation Beyond Linear Diffusions: A Non-Asymptotic Analysis for DDIM-Type Samplers." International Conference on Machine Learning, 2023.

Markdown

[Chen et al. "Restoration-Degradation Beyond Linear Diffusions: A Non-Asymptotic Analysis for DDIM-Type Samplers." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/chen2023icml-restorationdegradation/)

BibTeX

@inproceedings{chen2023icml-restorationdegradation,
  title     = {{Restoration-Degradation Beyond Linear Diffusions: A Non-Asymptotic Analysis for DDIM-Type Samplers}},
  author    = {Chen, Sitan and Daras, Giannis and Dimakis, Alex},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {4462--4484},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/chen2023icml-restorationdegradation/}
}