Understanding and Mitigating Memorization in Generative Models via Sharpness of Probability Landscapes
Abstract
In this paper, we introduce a geometric framework to analyze memorization in diffusion models through the sharpness of the log probability density. We mathematically justify a previously proposed score-difference-based memorization metric by demonstrating its effectiveness in quantifying sharpness. Additionally, we propose a novel memorization metric that captures sharpness at the initial stage of image generation in latent diffusion models, offering early insights into potential memorization. Leveraging this metric, we develop a mitigation strategy that optimizes the initial noise of the generation process using a sharpness-aware regularization term. The code is publicly available at https://github.com/Dongjae0324/sharpness_memorization_diffusion.
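The following is a minimal, hedged sketch (not the authors' released code; see the repository above for that) of the core idea described in the abstract: using the norm of the conditional/unconditional score difference at the first denoising step as a sharpness proxy, and taking a few gradient steps on the initial noise to reduce it. The objects `unet`, `cond_emb`, and `uncond_emb` are assumed to be Stable-Diffusion-style components (e.g., a Hugging Face `diffusers` UNet and text embeddings); all names and hyperparameters are illustrative.

```python
import torch

def sharpness_proxy(unet, latents, t, cond_emb, uncond_emb):
    """Norm of the score difference at timestep t, used as a sharpness proxy.

    Assumes a diffusers-style UNet whose forward pass returns an object
    with a `.sample` field containing the predicted noise.
    """
    eps_cond = unet(latents, t, encoder_hidden_states=cond_emb).sample
    eps_uncond = unet(latents, t, encoder_hidden_states=uncond_emb).sample
    return (eps_cond - eps_uncond).flatten(1).norm(dim=1).mean()

def adjust_initial_noise(unet, latents, t_init, cond_emb, uncond_emb,
                         steps=5, lr=0.05):
    """Take a few gradient steps on the initial noise to lower the proxy.

    This is only a sketch of a sharpness-aware initial-noise adjustment;
    the step count and learning rate are placeholder values.
    """
    latents = latents.clone().requires_grad_(True)
    optimizer = torch.optim.Adam([latents], lr=lr)
    for _ in range(steps):
        optimizer.zero_grad()
        loss = sharpness_proxy(unet, latents, t_init, cond_emb, uncond_emb)
        loss.backward()
        optimizer.step()
    return latents.detach()
```

After adjustment, the returned latents would be passed to the usual sampling loop in place of the raw Gaussian noise; the intended effect is to steer generation away from sharp regions of the log probability landscape associated with memorized samples.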
Cite
Text
Jeon et al. "Understanding and Mitigating Memorization in Generative Models via Sharpness of Probability Landscapes." Proceedings of the 42nd International Conference on Machine Learning, 2025.
Markdown
[Jeon et al. "Understanding and Mitigating Memorization in Generative Models via Sharpness of Probability Landscapes." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/jeon2025icml-understanding/)
BibTeX
@inproceedings{jeon2025icml-understanding,
  title     = {{Understanding and Mitigating Memorization in Generative Models via Sharpness of Probability Landscapes}},
  author    = {Jeon, Dongjae and Kim, Dueun and No, Albert},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {27091--27112},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/jeon2025icml-understanding/}
}