Backbone Cannot Be Trained at Once: Rolling Back to Pre-Trained Network for Person Re-Identification
Abstract
In the person re-identification (ReID) task, because of the shortage of training data, it is common to fine-tune a classification network pre-trained on a large dataset. However, it is relatively difficult to sufficiently fine-tune the low-level layers of the network due to the vanishing-gradient problem. In this work, we propose a novel fine-tuning strategy that allows the low-level layers to be sufficiently trained by rolling back the weights of the high-level layers to their initial pre-trained values. Our strategy alleviates the vanishing-gradient problem in the low-level layers and robustly trains them to fit the ReID dataset, thereby improving performance on ReID tasks. The improved performance of the proposed strategy is validated through several experiments. Furthermore, without any add-ons such as pose estimation or segmentation, our strategy achieves state-of-the-art performance using only a vanilla deep convolutional neural network architecture.
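The rolling-back idea in the abstract can be sketched in a few lines: fine-tune the whole network, then restore the high-level layers to their pre-trained weights so that later rounds of training concentrate on the low-level layers. The following is a minimal, dependency-free illustration; the layer names, the round count, and the toy training step are placeholders, not the authors' actual implementation.

```python
# Hedged sketch of the rolling-back fine-tuning strategy: after each round
# of fine-tuning, high-level layers are reset to their pre-trained weights,
# while low-level layers keep accumulating updates. Layer names and the
# train_step function are illustrative assumptions.

import copy

def rollback_finetune(weights, pretrained, high_level, train_step, rounds=3):
    """Fine-tune `weights`, rolling high-level layers back each round.

    weights / pretrained: dicts mapping layer name -> parameter value.
    high_level: names of layers rolled back to pre-trained values.
    train_step: function that updates and returns the weight dict.
    """
    w = copy.deepcopy(weights)
    for _ in range(rounds):
        w = train_step(w)                       # fine-tune all layers
        for name in high_level:                 # roll back high-level layers
            w[name] = copy.deepcopy(pretrained[name])
    return w

# Toy usage: a "training step" that shifts every parameter by +1.0.
pretrained = {"conv1": 0.0, "conv2": 0.0, "fc": 0.0}
trained = rollback_finetune(
    pretrained, pretrained, high_level=["fc"],
    train_step=lambda w: {k: v + 1.0 for k, v in w.items()},
)
# Low-level layers accumulate updates across rounds (conv1 -> 3.0), while
# the rolled-back "fc" layer ends at its pre-trained value (0.0).
```

In a real deep-learning framework the same pattern would copy the relevant entries of a saved pre-trained checkpoint back into the model between training rounds.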
Cite
Text
Ro et al. "Backbone Cannot Be Trained at Once: Rolling Back to Pre-Trained Network for Person Re-Identification." AAAI Conference on Artificial Intelligence, 2019. doi:10.1609/AAAI.V33I01.33018859
Markdown
[Ro et al. "Backbone Cannot Be Trained at Once: Rolling Back to Pre-Trained Network for Person Re-Identification." AAAI Conference on Artificial Intelligence, 2019.](https://mlanthology.org/aaai/2019/ro2019aaai-backbone/) doi:10.1609/AAAI.V33I01.33018859
BibTeX
@inproceedings{ro2019aaai-backbone,
title = {{Backbone Cannot Be Trained at Once: Rolling Back to Pre-Trained Network for Person Re-Identification}},
author = {Ro, Youngmin and Choi, Jongwon and Jo, Dae Ung and Heo, Byeongho and Lim, Jongin and Choi, Jin Young},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2019},
pages = {8859--8867},
doi = {10.1609/AAAI.V33I01.33018859},
url = {https://mlanthology.org/aaai/2019/ro2019aaai-backbone/}
}