A Unifying Perspective on Model Reuse: From Small to Large Pre-Trained Models
Abstract
Machine learning has rapidly progressed, resulting in a vast repository of both general and specialized models that address diverse practical needs. Reusing pre-trained models (PTMs) from public model zoos has emerged as an effective strategy, leveraging rich model resources and reshaping traditional machine learning workflows. These PTMs encapsulate valuable inductive biases beneficial for downstream tasks. Well-designed reuse strategies enable models to be adapted beyond their original scope, enhancing both performance and efficiency in target machine learning systems. This survey offers a unifying perspective on model reuse, establishing connections across various domains and presenting a novel taxonomy that encompasses the full lifecycle of PTM utilization---including selection from model zoos, adaptation techniques, and related areas such as model representation learning. We delve into the similarities and distinctions between reusing specialized and general PTMs, providing insights into their respective advantages and limitations. Furthermore, we discuss key challenges, emerging trends, and future directions in model reuse, aiming to guide research and practice in the era of large-scale pre-trained models. A comprehensive list of papers about model reuse is available at https://github.com/LAMDA-Model-Reuse/Awesome-Model-Reuse.
Cite
Text
Zhou and Ye. "A Unifying Perspective on Model Reuse: From Small to Large Pre-Trained Models." International Joint Conference on Artificial Intelligence, 2025. doi:10.24963/IJCAI.2025/1201
Markdown
[Zhou and Ye. "A Unifying Perspective on Model Reuse: From Small to Large Pre-Trained Models." International Joint Conference on Artificial Intelligence, 2025.](https://mlanthology.org/ijcai/2025/zhou2025ijcai-unifying/) doi:10.24963/IJCAI.2025/1201
BibTeX
@inproceedings{zhou2025ijcai-unifying,
title = {{A Unifying Perspective on Model Reuse: From Small to Large Pre-Trained Models}},
author = {Zhou, Da-Wei and Ye, Han-Jia},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2025},
pages = {10826--10835},
doi = {10.24963/IJCAI.2025/1201},
url = {https://mlanthology.org/ijcai/2025/zhou2025ijcai-unifying/}
}