Token Boosting for Robust Self-Supervised Visual Transformer Pre-Training

Abstract

Learning with large-scale unlabeled data has become a powerful tool for pre-training Visual Transformers (VTs). However, prior works tend to overlook that, in real-world scenarios, the input data may be corrupted and unreliable. Pre-training VTs on such corrupted data is challenging, especially via the masked autoencoding approach, where both the inputs and the masked "ground truth" targets can be unreliable. To address this limitation, we introduce the Token Boosting Module (TBM), a plug-and-play component that allows a VT to learn to extract clean and robust features during masked autoencoding pre-training. We provide theoretical analysis showing how TBM yields more robust and generalizable representations during pre-training, thus benefiting downstream tasks. We conduct extensive experiments to analyze TBM's effectiveness, and results on four corrupted datasets demonstrate that TBM consistently improves performance on downstream tasks.
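To make the "plug-and-play" idea concrete, below is a minimal PyTorch sketch of how a token-refinement module could be inserted between Vision Transformer encoder blocks during masked-autoencoding pre-training. The class names, residual design, and placement (after a chosen encoder block) are illustrative assumptions for exposition, not the actual TBM architecture from the paper.

import torch
import torch.nn as nn

class TokenBoostingSketch(nn.Module):
    """Hypothetical per-token refinement block: predicts a residual
    correction for each (possibly corrupted) token embedding."""
    def __init__(self, dim: int):
        super().__init__()
        self.refine = nn.Sequential(
            nn.LayerNorm(dim),
            nn.Linear(dim, dim),
            nn.GELU(),
            nn.Linear(dim, dim),
        )

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, num_tokens, dim); the residual connection keeps
        # the module plug-and-play with respect to the backbone.
        return tokens + self.refine(tokens)

class EncoderWithBoosting(nn.Module):
    """ViT-style encoder with the boosting sketch after a chosen block."""
    def __init__(self, dim: int = 768, depth: int = 12, heads: int = 12,
                 boost_after: int = 6):
        super().__init__()
        self.blocks = nn.ModuleList([
            nn.TransformerEncoderLayer(dim, heads, dim * 4,
                                       batch_first=True, norm_first=True)
            for _ in range(depth)
        ])
        self.boost_after = boost_after
        self.boost = TokenBoostingSketch(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for i, blk in enumerate(self.blocks):
            x = blk(x)
            if i + 1 == self.boost_after:
                x = self.boost(x)  # refine token features mid-encoder
        return x

# Usage: visible-patch tokens from an MAE-style pipeline, e.g. (B, N_visible, 768).
tokens = torch.randn(2, 49, 768)
out = EncoderWithBoosting()(tokens)
print(out.shape)  # torch.Size([2, 49, 768])

In an MAE-style setup, the encoder output would then feed a lightweight decoder that reconstructs the masked patches; the refinement module above only illustrates where a plug-and-play component could sit inside the encoder.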

Cite

Text

Li et al. "Token Boosting for Robust Self-Supervised Visual Transformer Pre-Training." Conference on Computer Vision and Pattern Recognition, 2023. doi:10.1109/CVPR52729.2023.02301

Markdown

[Li et al. "Token Boosting for Robust Self-Supervised Visual Transformer Pre-Training." Conference on Computer Vision and Pattern Recognition, 2023.](https://mlanthology.org/cvpr/2023/li2023cvpr-token/) doi:10.1109/CVPR52729.2023.02301

BibTeX

@inproceedings{li2023cvpr-token,
  title     = {{Token Boosting for Robust Self-Supervised Visual Transformer Pre-Training}},
  author    = {Li, Tianjiao and Foo, Lin Geng and Hu, Ping and Shang, Xindi and Rahmani, Hossein and Yuan, Zehuan and Liu, Jun},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2023},
  pages     = {24027--24038},
  doi       = {10.1109/CVPR52729.2023.02301},
  url       = {https://mlanthology.org/cvpr/2023/li2023cvpr-token/}
}