A Flexible Diffusion Model
Abstract
Denoising diffusion (score-based) generative models have become a popular choice for modeling complex data. Recently, a deep connection between forward-backward stochastic differential equations (SDEs) and diffusion-based models has been established, leading to the development of new SDE variants such as sub-VP and critically-damped Langevin. Despite the empirical success of some hand-crafted forward SDEs, many potentially promising forward SDEs remain unexplored. In this work, we propose a general framework for parameterizing diffusion models, particularly the spatial part of forward SDEs, by leveraging the symplectic and Riemannian geometry of the data manifold. We introduce a systematic formalism with theoretical guarantees and connect it with previous diffusion models. Finally, we demonstrate the theoretical advantages of our method from a variational optimization perspective, and present numerical experiments on synthetic datasets, MNIST, and CIFAR-10 to validate the effectiveness of our framework.
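For context on the objects the abstract manipulates, the forward-backward SDE pair underlying score-based diffusion models, together with the standard VP (variance-preserving) instance, can be written as below. This is textbook background (Anderson, 1982; Song et al., 2021), not the paper's geometric parameterization, which is not reproduced here.

```latex
% Standard forward/reverse SDE pair for score-based diffusion, shown for
% context only; the paper's contribution is a geometric parameterization
% of the drift f, which is NOT reproduced here.
\begin{align}
  \mathrm{d}x_t &= f(x_t, t)\,\mathrm{d}t + g(t)\,\mathrm{d}w_t
      && \text{(forward SDE)}\\
  \mathrm{d}x_t &= \bigl[f(x_t, t) - g(t)^2\,\nabla_x \log p_t(x_t)\bigr]\,\mathrm{d}t
      + g(t)\,\mathrm{d}\bar{w}_t
      && \text{(reverse SDE)}\\
  f(x, t) &= -\tfrac{1}{2}\,\beta(t)\,x, \qquad g(t) = \sqrt{\beta(t)}
      && \text{(VP instance)}
\end{align}
```

Here $w_t$ and $\bar{w}_t$ are forward and reverse-time Wiener processes and $p_t$ is the marginal density of $x_t$; the "spatial part" mentioned in the abstract corresponds, on this reading, to the state-dependent drift $f(x_t, t)$.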
Cite
Text
Du et al. "A Flexible Diffusion Model." International Conference on Machine Learning, 2023.Markdown
[Du et al. "A Flexible Diffusion Model." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/du2023icml-flexible/)BibTeX
@inproceedings{du2023icml-flexible,
title = {{A Flexible Diffusion Model}},
author = {Du, Weitao and Zhang, He and Yang, Tao and Du, Yuanqi},
booktitle = {International Conference on Machine Learning},
year = {2023},
pages = {8678--8696},
volume = {202},
url = {https://mlanthology.org/icml/2023/du2023icml-flexible/}
}