SaRA: High-Efficient Diffusion Model Fine-Tuning with Progressive Sparse Low-Rank Adaptation
Abstract
The development of diffusion models has led to significant progress in image and video generation tasks, with pre-trained models like the Stable Diffusion series playing a crucial role. However, a key challenge remains in downstream applications: how to effectively and efficiently adapt pre-trained diffusion models to new tasks. Inspired by model pruning, which lightens large pre-trained models by removing unimportant parameters, we propose a novel fine-tuning method that makes full use of these ineffective parameters to equip the pre-trained model with new task-specific capabilities. In this work, we first investigate the importance of parameters in pre-trained diffusion models and discover that parameters with the smallest absolute values do not contribute to the generation process due to training instabilities. Based on this observation, we propose a fine-tuning method termed SaRA that re-utilizes these temporarily ineffective parameters, which equates to optimizing a sparse weight matrix to learn the task-specific knowledge. To mitigate potential overfitting, we propose a nuclear-norm-based low-rank sparse training scheme for efficient fine-tuning. Furthermore, we design a new progressive parameter adjustment strategy to make full use of the fine-tuned parameters. Finally, we propose a novel unstructural backpropagation strategy, which significantly reduces memory costs during fine-tuning. Our method enhances the generative capabilities of pre-trained models in downstream applications and outperforms existing fine-tuning methods in maintaining the model's generalization ability. Source code is available at https://sjtuplayer.github.io/projects/SaRA.
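To make the core idea concrete, below is a minimal PyTorch sketch of the mechanism the abstract describes: select the small-magnitude (temporarily ineffective) parameters of a frozen weight matrix, train only a sparse update on those positions, and regularize that update with a nuclear-norm penalty to keep it low-rank. The function name, threshold, and regularization weight are illustrative assumptions for this sketch, not values or code from the paper's release.

```python
import torch

def sara_sketch(W: torch.Tensor, threshold: float = 2e-3, lam: float = 1e-4):
    """Illustrative sketch (not the authors' code) of SaRA's core idea:
    fine-tune only the smallest-magnitude parameters of a frozen matrix W,
    with a nuclear-norm penalty keeping the sparse update low-rank."""
    # 1. Mark the temporarily ineffective parameters by absolute value.
    mask = (W.abs() < threshold)                    # boolean sparse mask

    # 2. Train a dense update but apply it only where the mask is set;
    #    this is equivalent to optimizing a sparse weight matrix while
    #    the pre-trained base weights stay frozen.
    delta = torch.zeros_like(W, requires_grad=True) # trainable update
    W_new = W.detach() + delta * mask               # frozen base + sparse update

    # 3. Nuclear-norm penalty on the sparse update to mitigate overfitting
    #    (SaRA's unstructural backpropagation further avoids storing dense
    #    gradients; this sketch keeps a dense delta for simplicity).
    low_rank_penalty = lam * torch.linalg.matrix_norm(delta * mask, ord='nuc')
    return W_new, delta, low_rank_penalty
```

In training, `low_rank_penalty` would be added to the task loss and only `delta` passed to the optimizer, so the number of trainable parameters is bounded by the mask's sparsity rather than the full model size.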
Cite
Text
Hu et al. "SaRA: High-Efficient Diffusion Model Fine-Tuning with Progressive Sparse Low-Rank Adaptation." International Conference on Learning Representations, 2025.
Markdown
[Hu et al. "SaRA: High-Efficient Diffusion Model Fine-Tuning with Progressive Sparse Low-Rank Adaptation." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/hu2025iclr-sara/)
BibTeX
@inproceedings{hu2025iclr-sara,
title = {{SaRA: High-Efficient Diffusion Model Fine-Tuning with Progressive Sparse Low-Rank Adaptation}},
author = {Hu, Teng and Zhang, Jiangning and Yi, Ran and Huang, Hongrui and Wang, Yabiao and Ma, Lizhuang},
booktitle = {International Conference on Learning Representations},
year = {2025},
url = {https://mlanthology.org/iclr/2025/hu2025iclr-sara/}
}