Diffusion Tuning: Transferring Diffusion Models via Chain of Forgetting

Abstract

Diffusion models have significantly advanced the field of generative modeling. However, training a diffusion model is computationally expensive, creating a pressing need to adapt off-the-shelf diffusion models for downstream generation tasks. Current fine-tuning methods focus on parameter-efficient transfer learning but overlook the fundamental transfer characteristics of diffusion models. In this paper, we investigate the transferability of diffusion models and observe a monotonic chain-of-forgetting trend of transferability along the reverse process. Based on this observation and novel theoretical insights, we present Diff-Tuning, a frustratingly simple transfer approach that leverages the chain-of-forgetting tendency. Diff-Tuning encourages the fine-tuned model to retain the pre-trained knowledge at the end of the denoising chain close to the generated data while discarding it on the noise side of the chain. We conduct comprehensive experiments to evaluate Diff-Tuning, including the transfer of pre-trained Diffusion Transformer models to eight downstream generation tasks and the adaptation of Stable Diffusion to five control conditions with ControlNet. Diff-Tuning achieves a 24.6% improvement over standard fine-tuning and enhances the convergence speed of ControlNet by 24%. Notably, parameter-efficient transfer learning techniques for diffusion models can also benefit from Diff-Tuning. Code is available at this repository: https://github.com/thuml/Diffusion-Tuning.
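
The abstract only sketches the idea at a high level. As a rough illustration (not the paper's exact objective), the chain of forgetting can be read as a timestep-dependent weighting that keeps the fine-tuned denoiser close to the frozen pre-trained model near the data end of the reverse chain and relaxes that constraint toward the noise end. The sketch below is a minimal, assumed instantiation: the function name, the linear weight schedule, the retention term, and the epsilon-prediction interface are illustrative choices, not taken from the paper.

    # Minimal, illustrative sketch of a chain-of-forgetting style fine-tuning
    # objective for a diffusion model. The weight schedule w(t), the retention
    # term, and all names below are assumptions for illustration only; this is
    # NOT the exact Diff-Tuning objective from the paper.
    import torch
    import torch.nn.functional as F

    def chain_of_forgetting_loss(student, teacher, x0, t, noise, alphas_cumprod):
        """student: trainable copy of the pre-trained denoiser (epsilon-prediction).
        teacher: frozen pre-trained denoiser.
        x0: clean downstream samples (B, C, H, W); t: sampled integer timesteps (B,);
        noise: Gaussian noise like x0; alphas_cumprod: cumulative alpha schedule (T,).
        """
        # Standard DDPM forward process: diffuse the clean sample to timestep t.
        a_bar = alphas_cumprod[t].view(-1, 1, 1, 1)
        x_t = a_bar.sqrt() * x0 + (1.0 - a_bar).sqrt() * noise

        eps_student = student(x_t, t)

        # Ordinary fine-tuning loss: denoise the downstream data.
        loss_ft = F.mse_loss(eps_student, noise, reduction="none").mean(dim=(1, 2, 3))

        # Knowledge-retention term: stay close to the frozen pre-trained model.
        with torch.no_grad():
            eps_teacher = teacher(x_t, t)
        loss_retain = F.mse_loss(eps_student, eps_teacher, reduction="none").mean(dim=(1, 2, 3))

        # Chain of forgetting: retain pre-trained knowledge near the data end
        # (small t) and progressively discard it toward the noise end (large t).
        # A linearly decreasing weight is assumed here purely for illustration.
        w = 1.0 - t.float() / alphas_cumprod.numel()
        return (loss_ft + w * loss_retain).mean()

In this reading, the retention weight is largest where the abstract says pre-trained knowledge transfers best (close to the generated data) and vanishes at the noisy end, so the downstream denoising loss dominates there; the actual weighting used by Diff-Tuning is given in the paper.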

Cite

Text

Zhong et al. "Diffusion Tuning: Transferring Diffusion Models via Chain of Forgetting." Neural Information Processing Systems, 2024. doi:10.52202/079017-3639

Markdown

[Zhong et al. "Diffusion Tuning: Transferring Diffusion Models via Chain of Forgetting." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/zhong2024neurips-diffusion/) doi:10.52202/079017-3639

BibTeX

@inproceedings{zhong2024neurips-diffusion,
  title     = {{Diffusion Tuning: Transferring Diffusion Models via Chain of Forgetting}},
  author    = {Zhong, Jincheng and Guo, Xingzhuo and Dong, Jiaxiang and Long, Mingsheng},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-3639},
  url       = {https://mlanthology.org/neurips/2024/zhong2024neurips-diffusion/}
}