Diffusion Probabilistic Models Generalize When They Fail to Memorize

Abstract

In this work, we study the training of diffusion probabilistic models through a series of hypotheses and carefully designed experiments. Our key finding, which we call the memorization-generalization dichotomy, asserts that generalization and memorization are mutually exclusive phenomena. This contrasts with the modern wisdom in supervised learning that deep neural networks exhibit "benign" overfitting and generalize well despite overfitting to the training data.
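
The dichotomy rests on being able to tell memorized outputs apart from genuinely novel ones. As an illustration only, and not the authors' experimental protocol, the sketch below flags a generated sample as "memorized" when its nearest training example lies within a distance threshold; the function name, threshold, and toy data are assumptions made for this example.

import numpy as np

def memorization_rate(generated, train, threshold=0.5):
    # generated: (m, d) array of generated samples (e.g. flattened images)
    # train:     (n, d) array of training samples
    # Returns the fraction of generated samples whose nearest training
    # example lies within `threshold` in Euclidean distance (near-copies).
    g2 = np.sum(generated ** 2, axis=1, keepdims=True)   # (m, 1)
    t2 = np.sum(train ** 2, axis=1, keepdims=True).T     # (1, n)
    d2 = g2 - 2.0 * generated @ train.T + t2              # squared pairwise distances, (m, n)
    nn_dist = np.sqrt(np.maximum(d2.min(axis=1), 0.0))    # nearest-neighbor distance per sample
    return float(np.mean(nn_dist < threshold))

# Toy usage with synthetic data standing in for real images:
# half the "generated" samples are near-copies of training points, half are novel.
rng = np.random.default_rng(0)
train = rng.normal(size=(1000, 64))
generated = np.concatenate([
    train[:50] + 1e-3 * rng.normal(size=(50, 64)),  # near-copies (memorized)
    rng.normal(size=(50, 64)),                       # novel samples (generalized)
])
print(memorization_rate(generated, train))  # roughly 0.5 on this toy data

A low memorization rate on held-out-quality samples would indicate generalization, while a rate near one would indicate the model is reproducing its training set; any real study would need a threshold calibrated to the data scale.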

Cite

Text

Yoon et al. "Diffusion Probabilistic Models Generalize When They Fail to Memorize." ICML 2023 Workshops: SPIGM, 2023.

Markdown

[Yoon et al. "Diffusion Probabilistic Models Generalize When They Fail to Memorize." ICML 2023 Workshops: SPIGM, 2023.](https://mlanthology.org/icmlw/2023/yoon2023icmlw-diffusion/)

BibTeX

@inproceedings{yoon2023icmlw-diffusion,
  title     = {{Diffusion Probabilistic Models Generalize When They Fail to Memorize}},
  author    = {Yoon, TaeHo and Choi, Joo Young and Kwon, Sehyun and Ryu, Ernest K.},
  booktitle = {ICML 2023 Workshops: SPIGM},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/yoon2023icmlw-diffusion/}
}