UNIP: Rethinking Pre-Trained Attention Patterns for Infrared Semantic Segmentation

Abstract

Pre-training techniques significantly enhance the performance of semantic segmentation tasks with limited training data. However, their efficacy under a large domain gap between pre-training (e.g., RGB) and fine-tuning (e.g., infrared) remains underexplored. In this study, we first benchmark the infrared semantic segmentation performance of various pre-training methods and reveal several phenomena distinct from the RGB domain. Next, our layerwise analysis of pre-trained attention maps uncovers that: (1) there are three typical attention patterns (local, hybrid, and global); (2) pre-training tasks notably influence the pattern distribution across layers; (3) the hybrid pattern is crucial for semantic segmentation, as it attends to both nearby and foreground elements; (4) texture bias impedes model generalization in infrared tasks. Building on these insights, we propose UNIP, a UNified Infrared Pre-training framework, to enhance pre-trained model performance. This framework uses the hybrid-attention distillation NMI-HAD as the pre-training target, a large-scale mixed dataset InfMix for pre-training, and a last-layer feature pyramid network LL-FPN for fine-tuning. Experimental results show that UNIP outperforms various pre-training methods by up to 13.5% in average mIoU on three infrared segmentation tasks, evaluated with fine-tuning and linear probing metrics. UNIP-S achieves performance on par with MAE-L while requiring only 1/10 of the computational cost. Furthermore, with fewer parameters, UNIP significantly surpasses state-of-the-art (SOTA) infrared and RGB segmentation methods and demonstrates broad potential for application to other modalities, such as RGB and depth. Our code is available at https://github.com/casiatao/UNIP.
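The abstract names NMI-HAD, which suggests a normalized-mutual-information criterion over attention maps for identifying hybrid-attention layers to distill. As a rough intuition, the NMI of an attention map measures how strongly the attended keys depend on the query: high NMI indicates query-dependent (local) attention, low NMI indicates near query-independent (global) attention, with hybrid layers in between. Below is a minimal sketch, assuming the joint distribution p(q, k) is induced by a row-stochastic attention matrix with queries weighted uniformly; the function name and the normalization by the geometric mean of marginal entropies are illustrative assumptions, not the paper's actual implementation.

```python
import torch

def attention_nmi(attn: torch.Tensor, eps: float = 1e-12) -> torch.Tensor:
    """Normalized mutual information of a single-head attention map.

    attn: (N, N) row-stochastic matrix (rows = queries, cols = keys).
    Treats p(q, k) = attn[q, k] / N, i.e., queries are uniform. High NMI
    means attended keys vary strongly with the query (a local pattern);
    low NMI means all queries attend to similar keys (a global pattern).
    """
    n = attn.shape[0]
    p_qk = attn / n                       # joint distribution, sums to 1
    p_q = p_qk.sum(dim=1, keepdim=True)   # query marginal (uniform, 1/N each)
    p_k = p_qk.sum(dim=0, keepdim=True)   # key marginal

    # Mutual information I(Q; K) = sum p(q,k) log(p(q,k) / (p(q) p(k)))
    mi = (p_qk * (torch.log(p_qk + eps) - torch.log(p_q * p_k + eps))).sum()

    # Normalize by the geometric mean of the marginal entropies
    h_q = -(p_q * torch.log(p_q + eps)).sum()
    h_k = -(p_k * torch.log(p_k + eps)).sum()
    return mi / torch.sqrt(h_q * h_k + eps)
```

Under this reading, per-layer NMI scores (averaged over heads) could be thresholded to bucket layers into the local, hybrid, and global patterns the abstract describes.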

Cite

Text

Zhang et al. "UNIP: Rethinking Pre-Trained Attention Patterns for Infrared Semantic Segmentation." International Conference on Learning Representations, 2025.

Markdown

[Zhang et al. "UNIP: Rethinking Pre-Trained Attention Patterns for Infrared Semantic Segmentation." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/zhang2025iclr-unip/)

BibTeX

@inproceedings{zhang2025iclr-unip,
  title     = {{UNIP: Rethinking Pre-Trained Attention Patterns for Infrared Semantic Segmentation}},
  author    = {Zhang, Tao and Wen, Jinyong and Chen, Zhen and Ding, Kun and Xiang, Shiming and Pan, Chunhong},
  booktitle = {International Conference on Learning Representations},
  year      = {2025},
  url       = {https://mlanthology.org/iclr/2025/zhang2025iclr-unip/}
}