Augmented Geometric Distillation for Data-Free Incremental Person ReID
Abstract
Incremental learning (IL) remains an open issue for Person Re-identification (ReID), where a ReID system is expected to preserve preceding knowledge while learning incrementally. However, due to strict privacy licenses and the open-set retrieval setting, it is intractable to adapt existing class IL methods to ReID. In this work, we propose an Augmented Geometric Distillation (AGD) framework to tackle these issues. First, a general data-free incremental framework with dreaming memory is constructed to avoid privacy disclosure. On this basis, we reveal a "noisy distillation" problem stemming from the noise in dreaming memory, and further propose to augment distillation in a pairwise and cross-wise pattern over different views of memory to mitigate it. Second, to address the open-set retrieval property, we propose to maintain the feature-space structure as the model evolves via a novel geometric approach, preserving the relationships between exemplars when representations drift. Extensive experiments demonstrate the superiority of our AGD over the baseline by a margin of 6.0% mAP / 7.9% R@1, and show that it generalizes to class IL. Code is available.
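The abstract's second contribution — preserving the relationships between exemplars as representations drift — can be illustrated with a minimal sketch. The code below is a hypothetical interpretation, not the paper's actual loss: it penalizes the change in the pairwise cosine-similarity structure between features from the old (frozen) model and the new (incrementally trained) model, so the geometric relations among exemplars are retained.

```python
import numpy as np

def relation_distillation_loss(feats_old: np.ndarray, feats_new: np.ndarray) -> float:
    """Hypothetical sketch of a geometry-preserving distillation term.

    feats_old: (N, D) features of N exemplars from the frozen old model.
    feats_new: (N, D) features of the same exemplars from the new model.
    Returns the mean squared difference between the two pairwise
    cosine-similarity matrices (lower = structure better preserved).
    """
    def cosine_sim(F: np.ndarray) -> np.ndarray:
        # L2-normalize rows, then the Gram matrix holds pairwise cosines.
        F = F / np.linalg.norm(F, axis=1, keepdims=True)
        return F @ F.T

    S_old = cosine_sim(feats_old)
    S_new = cosine_sim(feats_new)
    return float(np.mean((S_old - S_new) ** 2))
```

When the new model's features are identical to the old ones the loss is zero; as representations drift, the loss grows with the distortion of the similarity structure rather than with absolute feature movement, which is the intuition behind distilling geometry instead of raw features.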
Cite
Text
Lu et al. "Augmented Geometric Distillation for Data-Free Incremental Person ReID." Conference on Computer Vision and Pattern Recognition, 2022. doi:10.1109/CVPR52688.2022.00718

Markdown

[Lu et al. "Augmented Geometric Distillation for Data-Free Incremental Person ReID." Conference on Computer Vision and Pattern Recognition, 2022.](https://mlanthology.org/cvpr/2022/lu2022cvpr-augmented/) doi:10.1109/CVPR52688.2022.00718

BibTeX
@inproceedings{lu2022cvpr-augmented,
title = {{Augmented Geometric Distillation for Data-Free Incremental Person ReID}},
author = {Lu, Yichen and Wang, Mei and Deng, Weihong},
booktitle = {Conference on Computer Vision and Pattern Recognition},
year = {2022},
  pages = {7329--7338},
doi = {10.1109/CVPR52688.2022.00718},
url = {https://mlanthology.org/cvpr/2022/lu2022cvpr-augmented/}
}