Constant Rate Scheduling: A General Framework for Optimizing Diffusion Noise Schedule via Distributional Change

Abstract

We propose a general framework for optimizing noise schedules in diffusion models, applicable to both training and sampling. Our method enforces a constant rate of change in the probability distribution of diffused data throughout the diffusion process, where the rate of change is quantified using a user-defined discrepancy measure. We introduce three such measures, which can be flexibly selected or combined depending on the domain and model architecture. While our framework is inspired by theoretical insights, we do not aim to provide a complete theoretical justification of how distributional change affects sample quality. Instead, we focus on establishing a general-purpose scheduling framework and validating its empirical effectiveness. Through extensive experiments, we demonstrate that our approach consistently improves the performance of both pixel-space and latent-space diffusion models across various datasets, samplers, and numbers of function evaluations ranging from 5 to 250. In particular, when applied to both training and sampling schedules, our method achieves a state-of-the-art FID score of 2.03 on LSUN Horse 256$\times$256, without compromising mode coverage.
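To make the core idea concrete, the following is a minimal sketch (not the authors' code) of constant-rate timestep selection: given a user-defined discrepancy between the diffused distributions at adjacent times on a fine grid, the sampling steps are placed so that each step covers an equal share of the total accumulated discrepancy. The function name, the toy log-SNR curve, and the squared-change discrepancy are illustrative assumptions, not details taken from the paper.

import numpy as np

def constant_rate_schedule(fine_times, step_discrepancy, num_steps):
    """Select timesteps with an (approximately) constant rate of distributional change.

    fine_times: (M,) increasing diffusion times on a fine grid.
    step_discrepancy: (M-1,) nonnegative discrepancy between the diffused
        distributions at consecutive fine-grid times (any user-defined measure).
    num_steps: number of sampling steps to select.
    Returns (num_steps+1,) timesteps with equal discrepancy per step.
    """
    # Cumulative distributional change along the fine grid.
    cum = np.concatenate([[0.0], np.cumsum(step_discrepancy)])
    # Equal increments of cumulative change, one per sampling step.
    targets = np.linspace(0.0, cum[-1], num_steps + 1)
    # Invert the cumulative-change curve by linear interpolation.
    return np.interp(targets, cum, fine_times)

# Hypothetical example: discrepancy taken as the squared change of a toy log-SNR curve.
fine_t = np.linspace(0.0, 1.0, 1001)
log_snr = 6.0 * np.cos(np.pi * fine_t)
disc = np.diff(log_snr) ** 2
print(constant_rate_schedule(fine_t, disc, num_steps=10))

Because the toy log-SNR curve is nonlinear, the resulting timesteps are non-uniform, concentrating steps where the distribution changes fastest; with a different discrepancy measure the same routine yields a different schedule.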

Cite

Text

Okada et al. "Constant Rate Scheduling: A General Framework for Optimizing Diffusion Noise Schedule via Distributional Change." Transactions on Machine Learning Research, 2026.

Markdown

[Okada et al. "Constant Rate Scheduling: A General Framework for Optimizing Diffusion Noise Schedule via Distributional Change." Transactions on Machine Learning Research, 2026.](https://mlanthology.org/tmlr/2026/okada2026tmlr-constant/)

BibTeX

@article{okada2026tmlr-constant,
  title     = {{Constant Rate Scheduling: A General Framework for Optimizing Diffusion Noise Schedule via Distributional Change}},
  author    = {Okada, Shuntaro and Doi, Kenji and Yoshihashi, Ryota and Kataoka, Hirokatsu and Tanaka, Tomohiro},
  journal   = {Transactions on Machine Learning Research},
  year      = {2026},
  url       = {https://mlanthology.org/tmlr/2026/okada2026tmlr-constant/}
}