Does Generation Require Memorization? Creative Diffusion Models Using Ambient Diffusion

Abstract

There is strong empirical evidence that the state-of-the-art diffusion modeling paradigm leads to models that memorize the training set, especially when the training set is small. Prior methods to mitigate the memorization problem often lead to a decrease in image quality. Is it possible to obtain strong and creative generative models, i.e., models that achieve both high generation quality and low memorization? Despite the current pessimistic landscape of results, we make significant progress in pushing the trade-off between fidelity and memorization. We first provide theoretical evidence that memorization in diffusion models is only necessary for denoising problems at low noise scales (usually used in generating high-frequency details). Using this theoretical insight, we propose a simple, principled method to train diffusion models using noisy data at large noise scales. We show that our method significantly reduces memorization without decreasing image quality, for both text-conditional and unconditional models and for a variety of data availability settings.
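
The abstract's central mechanism (supervising the model with noisy data at large noise scales, so it never has to fit the clean training images there) can be sketched as a thresholded denoising loss. The sketch below is illustrative and not the paper's implementation: the function names, the `sigma_cutoff` value, and the exact large-scale objective are assumptions layered on a generic denoiser interface.

```python
import torch

def thresholded_denoising_loss(denoiser, x_clean, sigma, sigma_cutoff=0.2):
    """Hypothetical per-batch training loss. `denoiser(x, sigma)` is assumed
    to predict the image it was supervised with at noise level sigma."""
    noise = torch.randn_like(x_clean)
    if sigma <= sigma_cutoff:
        # Low noise scales: ordinary denoising objective on clean data,
        # where (per the paper's theory) some memorization is unavoidable.
        x_noisy = x_clean + sigma * noise
        return ((denoiser(x_noisy, sigma) - x_clean) ** 2).mean()
    # Large noise scales: first corrupt the data to sigma_cutoff, then add
    # the remaining noise; supervision uses only the noisy sample, so the
    # model never sees the clean training images at these scales.
    x_ambient = x_clean + sigma_cutoff * torch.randn_like(x_clean)
    extra = (sigma ** 2 - sigma_cutoff ** 2) ** 0.5
    x_noisy = x_ambient + extra * noise
    # The minimizer here is E[x_ambient | x_noisy]; a clean-image estimate
    # can be recovered from it by a Tweedie-style linear rescaling.
    return ((denoiser(x_noisy, sigma) - x_ambient) ** 2).mean()
```

In this reading, the cutoff separates the low noise scales responsible for high-frequency detail (trained on clean data) from the large noise scales, where noisy supervision suffices and memorization can be avoided.
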

Cite

Text

Shah et al. "Does Generation Require Memorization? Creative Diffusion Models Using Ambient Diffusion." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Shah et al. "Does Generation Require Memorization? Creative Diffusion Models Using Ambient Diffusion." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/shah2025icml-generation/)

BibTeX

@inproceedings{shah2025icml-generation,
  title     = {{Does Generation Require Memorization? Creative Diffusion Models Using Ambient Diffusion}},
  author    = {Shah, Kulin and Kalavasis, Alkis and Klivans, Adam and Daras, Giannis},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {54143--54166},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/shah2025icml-generation/}
}