InsCLR: Improving Instance Retrieval with Self-Supervision
Abstract
This work aims at improving instance retrieval with self-supervision. We find that fine-tuning using recently developed self-supervised learning (SSL) methods, such as SimCLR and MoCo, fails to improve the performance of instance retrieval. In this work, we identify that the learnt representations for instance retrieval should be invariant to large variations in viewpoint, background, etc., whereas the self-augmented positives used by current SSL methods cannot provide strong enough signals for learning robust instance-level representations. To overcome this problem, we propose InsCLR, a new SSL method built on instance-level contrast, which learns intra-class invariance by dynamically mining meaningful pseudo positive samples from both mini-batches and a memory bank during training. Extensive experiments demonstrate that InsCLR achieves similar or even better performance than the state-of-the-art SSL methods on instance retrieval. Code is available at https://github.com/zeludeng/insclr.
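To make the mining idea in the abstract concrete, below is a minimal PyTorch sketch of an instance-level contrastive loss that treats high-similarity candidates from the current mini-batch and a memory bank as pseudo positives. This is not the authors' exact algorithm (the paper's candidate selection is more elaborate); the function name `insclr_style_loss` and the fixed `pos_threshold` are illustrative assumptions — see the official repository for the real implementation.

```python
import torch
import torch.nn.functional as F

def insclr_style_loss(batch_feats, bank_feats, temperature=0.2, pos_threshold=0.6):
    """Illustrative sketch: contrastive loss over mined pseudo positives."""
    # L2-normalise so that dot products become cosine similarities.
    batch_feats = F.normalize(batch_feats, dim=1)    # (B, D)
    bank_feats = F.normalize(bank_feats, dim=1)      # (K, D)

    # Candidate pool = other samples in the mini-batch + the memory bank.
    candidates = torch.cat([batch_feats, bank_feats], dim=0)   # (B+K, D)
    cos = batch_feats @ candidates.t()                         # (B, B+K)

    # Exclude each sample's similarity to itself (first B columns).
    b = batch_feats.size(0)
    self_mask = torch.eye(b, candidates.size(0), device=cos.device, dtype=torch.bool)

    # Mine pseudo positives: candidates whose similarity exceeds the threshold
    # (a stand-in for the paper's dynamic candidate selection).
    pos_mask = (cos > pos_threshold) & ~self_mask

    logits = (cos / temperature).masked_fill(self_mask, float('-inf'))
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)

    # Average the log-likelihood over mined positives; skip rows with none.
    pos_log_prob = log_prob.masked_fill(~pos_mask, 0.0)
    per_sample = -pos_log_prob.sum(dim=1) / pos_mask.sum(dim=1).clamp(min=1)
    valid = pos_mask.any(dim=1)
    return per_sample[valid].mean() if valid.any() else cos.new_zeros(())
```

The loss has the usual InfoNCE form, except that each anchor can have multiple positives, all drawn from the mined candidate set rather than from self-augmented views.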
Cite
Text
Deng et al. "InsCLR: Improving Instance Retrieval with Self-Supervision." AAAI Conference on Artificial Intelligence, 2022. doi:10.1609/AAAI.V36I1.19930

Markdown

[Deng et al. "InsCLR: Improving Instance Retrieval with Self-Supervision." AAAI Conference on Artificial Intelligence, 2022.](https://mlanthology.org/aaai/2022/deng2022aaai-insclr/) doi:10.1609/AAAI.V36I1.19930

BibTeX
@inproceedings{deng2022aaai-insclr,
title = {{InsCLR: Improving Instance Retrieval with Self-Supervision}},
author = {Deng, Zelu and Zhong, Yujie and Guo, Sheng and Huang, Weilin},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2022},
pages = {516--524},
doi = {10.1609/AAAI.V36I1.19930},
url = {https://mlanthology.org/aaai/2022/deng2022aaai-insclr/}
}