Adaptive Stochastic Coefficients for Accelerating Diffusion Sampling

Abstract

Diffusion-based generative processes, formulated as differential-equation solving, face an inherent trade-off between computational speed and sample quality. Our theoretical investigation of ODE- and SDE-based solvers reveals complementary weaknesses: ODE solvers accumulate irreducible gradient error along deterministic trajectories, while SDE methods suffer from amplified discretization error when the step budget is limited. Building on this insight, we introduce AdaSDE, a novel single-step SDE solver that unifies the efficiency of ODEs with the error resilience of SDEs. Specifically, we introduce a single per-step learnable coefficient, estimated via lightweight distillation, that dynamically regulates the error-correction strength to accelerate diffusion sampling. Notably, our framework can be integrated with existing solvers to enhance their capabilities. Extensive experiments demonstrate state-of-the-art performance: at 5 NFE, AdaSDE achieves FID scores of $4.18$ on CIFAR-10, $8.05$ on FFHQ, and $6.96$ on LSUN Bedroom. Code is available at https://github.com/WLU-wry02/AdaSDE.
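
To make the mechanism concrete, the sketch below shows how a per-step stochastic coefficient could enter a single sampler step, in the style of a churn-based (EDM-like) SDE sampler. The function name `adasde_like_step`, the specific update rule, and the way `gamma` scales the injected noise are illustrative assumptions, not the paper's actual algorithm.

```python
import torch

@torch.no_grad()
def adasde_like_step(denoiser, x, sigma_cur, sigma_next, gamma):
    """One hypothetical sampler step: an Euler update on the probability-flow
    ODE combined with churn-style noise injection, where the per-step
    coefficient `gamma` controls the stochasticity strength (assumption)."""
    # Inflate the current noise level by the learned coefficient.
    sigma_hat = sigma_cur * (1.0 + gamma)
    # Inject fresh Gaussian noise so x effectively sits at level sigma_hat.
    x_hat = x + (sigma_hat**2 - sigma_cur**2) ** 0.5 * torch.randn_like(x)
    # Deterministic Euler step from sigma_hat down to sigma_next.
    d = (x_hat - denoiser(x_hat, sigma_hat)) / sigma_hat
    return x_hat + (sigma_next - sigma_hat) * d
```

In this reading, each step's `gamma` would be learned by distilling against a high-NFE reference trajectory, rather than fixed by hand as in standard churn samplers.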

Cite

Text

Wang et al. "Adaptive Stochastic Coefficients for Accelerating Diffusion Sampling." Advances in Neural Information Processing Systems, 2025.

Markdown

[Wang et al. "Adaptive Stochastic Coefficients for Accelerating Diffusion Sampling." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/wang2025neurips-adaptive-a/)

BibTeX

@inproceedings{wang2025neurips-adaptive-a,
  title     = {{Adaptive Stochastic Coefficients for Accelerating Diffusion Sampling}},
  author    = {Wang, Ruoyu and Zhu, Beier and Li, Junzhi and Yuan, Liangyu and Zhang, Chi},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/wang2025neurips-adaptive-a/}
}