What’s the Score? Automated Denoising Score Matching for Nonlinear Diffusions
Abstract
Reversing a diffusion process by learning its score forms the heart of diffusion-based generative modeling and of estimating properties of scientific systems. The tractable diffusion processes center on linear processes with a Gaussian stationary distribution, which limits the kinds of models that can be built to those that target a Gaussian prior and, more generally, limits the kinds of problems that can be generically solved to those with conditionally linear score functions. In this work, we introduce a family of tractable denoising score matching objectives, called local-DSM, built using local increments of the diffusion process. We show how local-DSM melded with Taylor expansions enables automated training and score estimation with nonlinear diffusion processes. To demonstrate these ideas, we use automated-DSM to train generative models using non-Gaussian priors on challenging low-dimensional distributions and the CIFAR10 image dataset. Additionally, we use automated-DSM to learn the scores for nonlinear processes studied in statistical physics.
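To make the idea concrete, below is a minimal sketch (not the authors' implementation) of a denoising-score-matching style loss built from a local, one-step increment of a nonlinear diffusion dX_t = f(X_t) dt + g dW_t. It relies on an Euler-Maruyama / first-order Taylor approximation of the short-time transition, so the target score of the local increment stays Gaussian and tractable even when the drift is nonlinear. The names `drift`, `score_net`, and `local_dsm_loss` are illustrative assumptions, not identifiers from the paper.

```python
import torch

def drift(x):
    # Example nonlinear drift (double-well potential); purely illustrative.
    return x - x ** 3

def local_dsm_loss(score_net, x0, t0, dt, g=1.0):
    """One-term local DSM loss for a single small increment t0 -> t0 + dt."""
    # Euler-Maruyama approximation of the local transition:
    # X_{t0+dt} | X_{t0}=x0  ~  N(x0 + f(x0) dt, g^2 dt)
    mean = x0 + drift(x0) * dt
    var = (g ** 2) * dt
    noise = torch.randn_like(x0)
    x1 = mean + var ** 0.5 * noise
    # Score of the Gaussian local transition, evaluated at the sampled point.
    target_score = -(x1 - mean) / var
    pred_score = score_net(x1, t0 + dt)
    return ((pred_score - target_score) ** 2).sum(dim=-1).mean()
```

For a linear drift this reduces to ordinary denoising score matching against the analytic Gaussian transition; for nonlinear drifts, the locally Gaussian approximation over a short increment is what keeps the regression target tractable, which is the role the Taylor expansions play in the paper's automated-DSM.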
Cite
Text

Singhal et al. "What’s the Score? Automated Denoising Score Matching for Nonlinear Diffusions." International Conference on Machine Learning, 2024.

Markdown

[Singhal et al. "What’s the Score? Automated Denoising Score Matching for Nonlinear Diffusions." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/singhal2024icml-whats/)

BibTeX
@inproceedings{singhal2024icml-whats,
  title     = {{What’s the Score? Automated Denoising Score Matching for Nonlinear Diffusions}},
  author    = {Singhal, Raghav and Goldstein, Mark and Ranganath, Rajesh},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {45734--45758},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/singhal2024icml-whats/}
}