Model Tailor: Mitigating Catastrophic Forgetting in Multi-Modal Large Language Models
Abstract
Catastrophic forgetting emerges as a critical challenge when fine-tuning multi-modal large language models (MLLMs), where improving performance on unseen tasks often leads to a significant performance drop on the original tasks. This paper presents a comprehensive analysis of catastrophic forgetting in MLLMs and introduces a post-training adjustment method called Model Tailor. Our method primarily preserves the pre-trained parameters while replacing only a small fraction ($\leq$ 10%) of them with their fine-tuned counterparts, retaining $\sim$ 99% of the pre-trained model's effectiveness on the original tasks and achieving $\sim$ 97% of standard fine-tuning performance on the new tasks. Specifically, we derive a sparse mask to identify the "model patch", based on a fusion strategy that integrates salience and sensitivity analysis. Subsequently, a compensation mechanism is introduced to "decorate the patch", enhancing the model's performance on both the target and original tasks. Additionally, our method is adaptable to multi-task scenarios. Through extensive experiments on InstructBLIP and LLaVA-1.5 on both image captioning and visual question answering tasks, our approach demonstrates strong adaptability to new tasks while preserving the inherent pre-trained capabilities.
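The abstract's core idea, selecting a small "patch" of fine-tuned parameters and grafting it onto the otherwise frozen pre-trained model, can be illustrated with a minimal sketch. The names below (`build_patch_mask`, `apply_patch`, the product-of-scores fusion) are illustrative assumptions for exposition, not the authors' implementation; the paper's actual salience/sensitivity fusion and compensation mechanism are more involved.

```python
# Minimal sketch of patch selection and grafting, under assumed scoring choices.
import torch

def build_patch_mask(theta_pre, theta_ft, grad_ft, keep_ratio=0.10):
    """Select roughly the top `keep_ratio` fraction of fine-tuned parameters to keep.

    Salience is approximated here by the magnitude of the parameter change and
    sensitivity by a gradient magnitude; their product is a stand-in for the
    fusion strategy described in the abstract.
    """
    salience = (theta_ft - theta_pre).abs()
    sensitivity = grad_ft.abs()
    score = salience * sensitivity
    k = max(1, int(keep_ratio * score.numel()))
    # k-th largest score serves as the inclusion threshold
    threshold = score.flatten().kthvalue(score.numel() - k + 1).values
    return score >= threshold  # True = keep the fine-tuned value

def apply_patch(theta_pre, theta_ft, mask):
    """Graft the selected fine-tuned parameters onto the pre-trained model."""
    return torch.where(mask, theta_ft, theta_pre)
```

A compensation step (omitted here) would further adjust the patched model to offset the discarded fine-tuned updates, which is what the paper refers to as "decorating the patch".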
Cite
Text
Zhu et al. "Model Tailor: Mitigating Catastrophic Forgetting in Multi-Modal Large Language Models." International Conference on Machine Learning, 2024.
Markdown
[Zhu et al. "Model Tailor: Mitigating Catastrophic Forgetting in Multi-Modal Large Language Models." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/zhu2024icml-model/)
BibTeX
@inproceedings{zhu2024icml-model,
title = {{Model Tailor: Mitigating Catastrophic Forgetting in Multi-Modal Large Language Models}},
author = {Zhu, Didi and Sun, Zhongyisun and Li, Zexi and Shen, Tao and Yan, Ke and Ding, Shouhong and Wu, Chao and Kuang, Kun},
booktitle = {International Conference on Machine Learning},
year = {2024},
pages = {62581--62598},
volume = {235},
url = {https://mlanthology.org/icml/2024/zhu2024icml-model/}
}