Contrastive Quantization with Code Memory for Unsupervised Image Retrieval

Abstract

Its high efficiency in computation and storage makes hashing (including binary hashing and quantization) a common strategy in large-scale retrieval systems. To alleviate the reliance on expensive annotations, unsupervised deep hashing has become an important research problem. This paper provides a novel solution to unsupervised deep quantization, namely Contrastive Quantization with Code Memory (MeCoQ). Unlike existing reconstruction-based strategies, we learn unsupervised binary descriptors by contrastive learning, which better captures discriminative visual semantics. In addition, we find that codeword diversity regularization is critical to prevent contrastive-learning-based quantization from model degeneration. Moreover, we introduce a novel quantization code memory module that boosts contrastive learning with lower feature drift than conventional feature memories. Extensive experiments on benchmark datasets show that MeCoQ outperforms state-of-the-art methods. Code and configurations are publicly released.
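To make the abstract's key ingredients concrete, here is a minimal NumPy sketch of a differentiable (soft) quantization step, an InfoNCE contrastive loss, and a codeword-diversity regularizer. All function names and the exact formulations are illustrative assumptions, not the paper's actual implementation; the soft-assignment relaxation shown is a common choice in deep quantization work.

```python
import numpy as np

def soft_quantize(z, codebook, temperature=1.0):
    """Softly assign each embedding to codewords via a softmax over
    negative squared distances (a common differentiable relaxation).
    z: (batch, dim), codebook: (num_codewords, dim)."""
    d2 = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)  # (batch, K)
    p = np.exp(-d2 / temperature)
    p /= p.sum(axis=1, keepdims=True)       # soft assignment probabilities
    return p @ codebook, p                  # soft reconstruction, probabilities

def info_nce(q, k, tau=0.2):
    """InfoNCE contrastive loss: q[i] should match k[i] (two augmented
    views of the same image) against all other k[j] as negatives."""
    q = q / np.linalg.norm(q, axis=1, keepdims=True)
    k = k / np.linalg.norm(k, axis=1, keepdims=True)
    logits = q @ k.T / tau
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    logp = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.diag(logp).mean())

def diversity_regularizer(p, eps=1e-12):
    """Negative entropy of the average codeword usage; minimizing it
    spreads assignments over all codewords (counters degeneration)."""
    usage = p.mean(axis=0)                  # (K,)
    return float((usage * np.log(usage + eps)).sum())

# Toy usage: two augmented views quantized against a shared codebook.
rng = np.random.default_rng(0)
codebook = rng.normal(size=(16, 8))
view_a = rng.normal(size=(4, 8))
view_b = view_a + 0.05 * rng.normal(size=(4, 8))  # mild "augmentation"
qa, pa = soft_quantize(view_a, codebook)
qb, pb = soft_quantize(view_b, codebook)
loss = info_nce(qa, qb) + 0.1 * diversity_regularizer(pa)
```

In the paper's full method, the contrastive negatives would additionally be drawn from the proposed code memory (storing compact quantization codes rather than raw features), which this sketch omits for brevity.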

Cite

Text

Wang et al. "Contrastive Quantization with Code Memory for Unsupervised Image Retrieval." AAAI Conference on Artificial Intelligence, 2022. doi:10.1609/AAAI.V36I3.20147

Markdown

[Wang et al. "Contrastive Quantization with Code Memory for Unsupervised Image Retrieval." AAAI Conference on Artificial Intelligence, 2022.](https://mlanthology.org/aaai/2022/wang2022aaai-contrastive/) doi:10.1609/AAAI.V36I3.20147

BibTeX

@inproceedings{wang2022aaai-contrastive,
  title     = {{Contrastive Quantization with Code Memory for Unsupervised Image Retrieval}},
  author    = {Wang, Jinpeng and Zeng, Ziyun and Chen, Bin and Dai, Tao and Xia, Shu-Tao},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2022},
  pages     = {2468--2476},
  doi       = {10.1609/aaai.v36i3.20147},
  url       = {https://mlanthology.org/aaai/2022/wang2022aaai-contrastive/}
}