TransTailor: Pruning the Pre-Trained Model for Improved Transfer Learning
Abstract
The increasing availability of pre-trained models has significantly improved performance on limited-data tasks through transfer learning. However, progress on transfer learning mainly focuses on optimizing the weights of pre-trained models, ignoring the structural mismatch between the model and the target task. This paper aims to improve transfer performance from another angle: in addition to tuning the weights, we tune the structure of pre-trained models to better match the target task. To this end, we propose TransTailor, which prunes the pre-trained model for improved transfer learning. Unlike traditional pruning pipelines, we prune and fine-tune the pre-trained model according to target-aware weight importance, generating an optimal sub-model tailored to a specific target task. In this way, we transfer a more suitable sub-structure that can be applied during fine-tuning to benefit the final performance. Extensive experiments on multiple pre-trained models and datasets demonstrate that TransTailor outperforms traditional pruning methods and achieves competitive or even better performance than other state-of-the-art transfer learning methods while using a smaller model. Notably, on the Stanford Dogs dataset, TransTailor achieves a 2.7% accuracy improvement over other transfer methods with 20% fewer FLOPs.
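The core idea of target-aware pruning can be illustrated with a minimal sketch: score each filter by how much it matters on target-task data, then remove the lowest-scoring filters before fine-tuning. The function names and the importance measure below (summed |weight × gradient| per filter, a first-order Taylor-style proxy) are illustrative assumptions for exposition, not the paper's exact algorithm.

```python
# Hedged sketch of importance-based structured pruning in the spirit of
# target-aware pruning. The |w * g| scoring rule is an assumption used
# for illustration, not TransTailor's exact importance definition.

def filter_importance(weights, grads):
    """Score each filter by summed |w * g| over its parameters
    (gradients are assumed to come from target-task data)."""
    return [sum(abs(w * g) for w, g in zip(fw, fg))
            for fw, fg in zip(weights, grads)]

def prune_lowest(weights, grads, n_prune):
    """Return indices of filters kept after dropping the n_prune
    least important ones; the kept sub-model is then fine-tuned."""
    scores = filter_importance(weights, grads)
    order = sorted(range(len(scores)), key=scores.__getitem__)
    drop = set(order[:n_prune])
    return [i for i in range(len(scores)) if i not in drop]

# Toy layer with 3 "filters", each holding 2 parameters.
weights = [[0.5, -0.2], [0.01, 0.02], [1.0, 0.8]]
grads   = [[0.1,  0.3], [0.05, 0.01], [0.2,  0.1]]
print(prune_lowest(weights, grads, 1))  # filter 1 scores lowest -> [0, 2]
```

In a real pipeline the same loop would run per layer on a deep network, alternating pruning and fine-tuning so that importance scores stay aligned with the target task.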
Cite
Text
Liu et al. "TransTailor: Pruning the Pre-Trained Model for Improved Transfer Learning." AAAI Conference on Artificial Intelligence, 2021. doi:10.1609/AAAI.V35I10.17046
Markdown
[Liu et al. "TransTailor: Pruning the Pre-Trained Model for Improved Transfer Learning." AAAI Conference on Artificial Intelligence, 2021.](https://mlanthology.org/aaai/2021/liu2021aaai-transtailor/) doi:10.1609/AAAI.V35I10.17046
BibTeX
@inproceedings{liu2021aaai-transtailor,
title = {{TransTailor: Pruning the Pre-Trained Model for Improved Transfer Learning}},
author = {Liu, Bingyan and Cai, Yifeng and Guo, Yao and Chen, Xiangqun},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2021},
pages = {8627-8634},
doi = {10.1609/AAAI.V35I10.17046},
url = {https://mlanthology.org/aaai/2021/liu2021aaai-transtailor/}
}