Unsupervised Representation for Semantic Segmentation by Implicit Cycle-Attention Contrastive Learning
Abstract
We study unsupervised representation learning for the semantic segmentation task. Different from previous works that aim to provide unsupervised pre-trained backbones for segmentation models, which still require supervised fine-tuning, we focus on representations trained only by unsupervised methods. This means models must directly generate pixel-level, linearly separable semantic results. We first explore and present two factors that have significant effects on segmentation under the contrastive learning framework: 1) the difficulty and diversity of the positive contrastive pairs, and 2) the balance of global and local features. To optimize these factors, we propose cycle-attention contrastive learning (CACL). CACL makes use of the semantic continuity of video frames, adopting an unsupervised cycle-consistent attention mechanism to implicitly conduct contrastive learning with difficult, global-local-balanced positive pixel pairs. Compared with the baseline model MoCo-v2 and other unsupervised methods, CACL demonstrates consistently superior performance on the PASCAL VOC (+4.5 mIoU) and Cityscapes (+4.5 mIoU) datasets.
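The abstract does not spell out CACL's exact objective, so the snippet below is only a minimal sketch of the general idea it describes: a pixel-level contrastive (InfoNCE-style) loss in which positive pairs are pixels in two video frames linked by a cycle-consistent correspondence, with the remaining key pixels serving as negatives. The function name `pixel_infonce_loss`, the tensor shapes, and the `correspondence` input (assumed to come from forward-backward cycle-consistent attention) are all assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only; names, shapes, and the correspondence input are assumptions.
import torch
import torch.nn.functional as F

def pixel_infonce_loss(feat_q, feat_k, correspondence, temperature=0.07):
    """feat_q, feat_k: (N, C, H, W) dense features from query/key encoders.
    correspondence: (N, H*W) long tensor; for each query pixel, the index of the
    key pixel matched to it by a cycle-consistent attention/tracking step."""
    n, c, h, w = feat_q.shape
    q = F.normalize(feat_q.flatten(2).transpose(1, 2), dim=-1)  # (N, HW, C)
    k = F.normalize(feat_k.flatten(2).transpose(1, 2), dim=-1)  # (N, HW, C)

    # Pairwise similarities between all query and key pixels.
    logits = torch.bmm(q, k.transpose(1, 2)) / temperature      # (N, HW, HW)

    # Each query pixel's positive is its cycle-consistent key pixel;
    # all other key pixels act as negatives.
    return F.cross_entropy(logits.flatten(0, 1), correspondence.flatten())
```

In this reading, the "difficulty" of positives comes from matching pixels across temporally separated frames rather than across simple augmentations, and the global/local balance comes from contrasting dense pixel features produced by a MoCo-style encoder pair.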
Cite
Text
Pang et al. "Unsupervised Representation for Semantic Segmentation by Implicit Cycle-Attention Contrastive Learning." AAAI Conference on Artificial Intelligence, 2022. doi:10.1609/AAAI.V36I2.20100
Markdown
[Pang et al. "Unsupervised Representation for Semantic Segmentation by Implicit Cycle-Attention Contrastive Learning." AAAI Conference on Artificial Intelligence, 2022.](https://mlanthology.org/aaai/2022/pang2022aaai-unsupervised/) doi:10.1609/AAAI.V36I2.20100
BibTeX
@inproceedings{pang2022aaai-unsupervised,
title = {{Unsupervised Representation for Semantic Segmentation by Implicit Cycle-Attention Contrastive Learning}},
author = {Pang, Bo and Li, Yizhuo and Zhang, Yifan and Peng, Gao and Tang, Jiajun and Zha, Kaiwen and Li, Jiefeng and Lu, Cewu},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2022},
pages = {2044-2052},
doi = {10.1609/AAAI.V36I2.20100},
url = {https://mlanthology.org/aaai/2022/pang2022aaai-unsupervised/}
}