Improved Analysis of Score-Based Generative Modeling: User-Friendly Bounds Under Minimal Smoothness Assumptions
Abstract
We give an improved theoretical analysis of score-based generative modeling. Under a score estimate with small $L^2$ error (averaged across timesteps), we provide efficient convergence guarantees for any data distribution with a finite second moment, by either employing early stopping or assuming a smoothness condition on the score function of the data distribution. Our result does not rely on any log-concavity or functional inequality assumption and has only a logarithmic dependence on the smoothness. In particular, we show that, under only a finite second moment condition, approximating the following in reverse KL divergence to $\epsilon$-accuracy can be done in $\tilde O\left(\frac{d \log (1/\delta)}{\epsilon}\right)$ steps: 1) the variance-$\delta$ Gaussian perturbation of any data distribution; 2) data distributions with $1/\delta$-smooth score functions. Our analysis also provides a quantitative comparison between different discrete approximations and may guide the choice of discretization points in practice.
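The step-count bound above concerns a discretized reverse-time SDE run with early stopping at time $\delta$. As a purely illustrative sketch (not the authors' code), the snippet below samples from one such discretization for the Ornstein-Uhlenbeck forward process $dX_t = -X_t\,dt + \sqrt{2}\,dB_t$, using a geometric grid near $\delta$ to mirror the $\log(1/\delta)$ dependence; the grid choice, the parameter values, and `gaussian_score` (an exact score for a Gaussian toy target, standing in for a learned estimate) are assumptions made only so the script runs end to end.

```python
# Minimal sketch (assumed setup, not the paper's code) of a reverse-SDE
# sampler with early stopping, for the OU forward process
#   dX_t = -X_t dt + sqrt(2) dB_t.
import numpy as np

def make_timesteps(T=10.0, delta=1e-3, n_coarse=100, n_fine=100):
    """Grid T = t_0 > ... > t_N = delta: uniform steps on [1, T], then
    geometrically shrinking steps on [delta, 1]. One illustrative way to
    realize the log(1/delta) dependence on the early-stopping time; the
    paper's discretization analysis is what would guide this choice."""
    coarse = np.linspace(T, 1.0, n_coarse, endpoint=False)
    fine = np.geomspace(1.0, delta, n_fine + 1)
    return np.concatenate([coarse, fine])

def gaussian_score(x, t, sigma2=1.0):
    """Exact score of p_t when the data are N(0, sigma2 * I): under the OU
    process, p_t = N(0, (e^{-2t} sigma2 + 1 - e^{-2t}) I). A hypothetical
    stand-in for a learned score estimate with small L^2 error."""
    var_t = np.exp(-2 * t) * sigma2 + (1 - np.exp(-2 * t))
    return -x / var_t

def reverse_sample(score_fn, d=2, T=10.0, delta=1e-3, rng=None):
    """Euler-Maruyama discretization of the reverse SDE
       dY = (Y + 2 s(Y, t)) dt + sqrt(2) dB,
    started from N(0, I) (close to p_T for large T) and stopped at delta."""
    rng = np.random.default_rng(rng)
    ts = make_timesteps(T, delta)
    y = rng.standard_normal(d)
    for t_cur, t_next in zip(ts[:-1], ts[1:]):
        h = t_cur - t_next
        drift = y + 2 * score_fn(y, t_cur)
        y = y + h * drift + np.sqrt(2 * h) * rng.standard_normal(d)
    # Approximate sample from the variance-delta Gaussian perturbation
    # of the data distribution.
    return y

if __name__ == "__main__":
    samples = np.stack([reverse_sample(gaussian_score, rng=i) for i in range(2000)])
    print("empirical var per coordinate:", samples.var(axis=0))  # ~1 for N(0, I) data
```

Swapping `make_timesteps` for other schedules is where the abstract's "quantitative comparison between different discrete approximations" would bite: the grid determines how many steps are spent near $t = \delta$, where the score can be stiff.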
Cite
Text
Chen et al. "Improved Analysis of Score-Based Generative Modeling: User-Friendly Bounds Under Minimal Smoothness Assumptions." International Conference on Machine Learning, 2023.
Markdown
[Chen et al. "Improved Analysis of Score-Based Generative Modeling: User-Friendly Bounds Under Minimal Smoothness Assumptions." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/chen2023icml-improved/)
BibTeX
@inproceedings{chen2023icml-improved,
title = {{Improved Analysis of Score-Based Generative Modeling: User-Friendly Bounds Under Minimal Smoothness Assumptions}},
author = {Chen, Hongrui and Lee, Holden and Lu, Jianfeng},
booktitle = {International Conference on Machine Learning},
year = {2023},
pages = {4735--4763},
volume = {202},
url = {https://mlanthology.org/icml/2023/chen2023icml-improved/}
}