RePre: Improving Self-Supervised Vision Transformer with Reconstructive Pre-Training

Abstract

Recently, self-supervised vision transformers have attracted unprecedented attention for their impressive representation learning ability. However, the dominant method, contrastive learning, mainly relies on an instance discrimination pretext task, which learns a global understanding of the image. This paper incorporates local feature learning into self-supervised vision transformers via Reconstructive Pre-training (RePre). RePre extends contrastive frameworks by adding a branch that reconstructs raw image pixels in parallel with the existing contrastive objective. RePre is equipped with a lightweight convolution-based decoder that fuses multi-hierarchy features from the transformer encoder. These multi-hierarchy features provide rich supervision signals spanning low- to high-level semantics, which is crucial for RePre. RePre brings decent improvements across various contrastive frameworks with different vision transformer architectures. Transfer performance on downstream tasks surpasses that of supervised pre-training and state-of-the-art (SOTA) self-supervised counterparts.
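The abstract describes a two-branch objective: the usual contrastive loss plus a pixel-reconstruction loss computed from a decoder over multi-hierarchy encoder features. The following is a minimal numpy sketch of that combined objective only; all shapes, the loss weighting `lam`, and the use of an L1 pixel loss and InfoNCE-style contrastive loss are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def contrastive_loss(z1, z2, temperature=0.1):
    """Simplified InfoNCE-style loss between embeddings of two views."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Matching pairs sit on the diagonal; cross-entropy over each row.
    return -np.mean(np.diag(log_prob))

def reconstruction_loss(decoded, pixels):
    """Pixel-level L1 loss between decoder output and the raw image."""
    return np.abs(decoded - pixels).mean()

# Toy batch: 8 images (32x32x3), 16-dim embeddings (hypothetical sizes).
pixels = rng.random((8, 32, 32, 3))
z1, z2 = rng.random((8, 16)), rng.random((8, 16))
# Stand-in for the lightweight conv decoder's output over fused features.
decoded = rng.random((8, 32, 32, 3))

lam = 1.0  # hypothetical weighting between the two branches
total = contrastive_loss(z1, z2) + lam * reconstruction_loss(decoded, pixels)
print(round(float(total), 4))
```

The key design point the abstract emphasizes is that the reconstruction branch runs in parallel with, rather than replacing, the contrastive objective, so both terms contribute gradients to the shared encoder.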

Cite

Text

Wang et al. "RePre: Improving Self-Supervised Vision Transformer with Reconstructive Pre-Training." International Joint Conference on Artificial Intelligence, 2022. doi:10.24963/IJCAI.2022/200

Markdown

[Wang et al. "RePre: Improving Self-Supervised Vision Transformer with Reconstructive Pre-Training." International Joint Conference on Artificial Intelligence, 2022.](https://mlanthology.org/ijcai/2022/wang2022ijcai-repre/) doi:10.24963/IJCAI.2022/200

BibTeX

@inproceedings{wang2022ijcai-repre,
  title     = {{RePre: Improving Self-Supervised Vision Transformer with Reconstructive Pre-Training}},
  author    = {Wang, Luya and Liang, Feng and Li, Yangguang and Zhang, Honggang and Ouyang, Wanli and Shao, Jing},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2022},
  pages     = {1437--1443},
  doi       = {10.24963/IJCAI.2022/200},
  url       = {https://mlanthology.org/ijcai/2022/wang2022ijcai-repre/}
}