Few-Shot Adaptation of Pre-Trained Networks for Domain Shift

Abstract

Deep networks are prone to performance degradation when there is a domain shift between the source (training) data and the target (test) data. Recent test-time adaptation methods update the batch normalization layers of pre-trained source models deployed in new target environments using streaming data. Although these methods can adapt on the fly without first collecting a large target-domain dataset, their performance depends on streaming conditions such as mini-batch size and class distribution, which can be unpredictable in practice. In this work, we propose a framework for few-shot domain adaptation to address the practical challenges of data-efficient adaptation. Specifically, we propose a constrained optimization of feature normalization statistics in pre-trained source models, supervised by a small target-domain support set. Our method is easy to implement and improves source-model performance with as little as one sample per class for classification tasks. Extensive experiments on 5 cross-domain classification and 4 semantic segmentation datasets show that our proposed method achieves more accurate and reliable performance than test-time adaptation, while not being constrained by streaming conditions.
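The core idea in the abstract, adapting feature normalization statistics toward a small target-domain support set while constraining how far they move from the source statistics, can be illustrated with a toy sketch. This is not the paper's actual method (the paper performs a supervised constrained optimization); the function names, the simple convex-combination constraint via `alpha`, and the numpy setting are all illustrative assumptions.

```python
import numpy as np

def adapt_norm_stats(source_mean, source_var, support_feats, alpha=0.5):
    """Illustrative sketch: blend source-domain normalization statistics
    with statistics estimated from a few target-domain support samples.

    alpha in [0, 1] acts as a simple constraint on how far the adapted
    statistics may move from the source statistics:
    alpha = 0 keeps the source stats, alpha = 1 uses support-set stats only.
    (The paper's constrained optimization is more involved than this.)
    """
    # Per-feature statistics from the (small) support set.
    target_mean = support_feats.mean(axis=0)
    target_var = support_feats.var(axis=0)
    # Convex combination keeps the adapted stats between source and target.
    adapted_mean = (1 - alpha) * source_mean + alpha * target_mean
    adapted_var = (1 - alpha) * source_var + alpha * target_var
    return adapted_mean, adapted_var

def normalize(x, mean, var, eps=1e-5):
    """Standard feature normalization with the adapted statistics."""
    return (x - mean) / np.sqrt(var + eps)

# Toy usage: 3-dimensional features, a 2-sample target support set.
source_mean, source_var = np.zeros(3), np.ones(3)
support = np.array([[1.0, 1.0, 1.0],
                    [3.0, 3.0, 3.0]])
mean, var = adapt_norm_stats(source_mean, source_var, support, alpha=0.5)
z = normalize(support, mean, var)
```

In a deployed network the same blending would be applied per normalization layer to its running mean and variance, with the mixing weight (or a richer set of parameters) chosen by optimizing a supervised loss on the support set.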

Cite

Text

Zhang et al. "Few-Shot Adaptation of Pre-Trained Networks for Domain Shift." International Joint Conference on Artificial Intelligence, 2022. doi:10.24963/IJCAI.2022/232

Markdown

[Zhang et al. "Few-Shot Adaptation of Pre-Trained Networks for Domain Shift." International Joint Conference on Artificial Intelligence, 2022.](https://mlanthology.org/ijcai/2022/zhang2022ijcai-few-a/) doi:10.24963/IJCAI.2022/232

BibTeX

@inproceedings{zhang2022ijcai-few-a,
  title     = {{Few-Shot Adaptation of Pre-Trained Networks for Domain Shift}},
  author    = {Zhang, Wenyu and Shen, Li and Zhang, Wanyue and Foo, Chuan-Sheng},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2022},
  pages     = {1665--1671},
  doi       = {10.24963/IJCAI.2022/232},
  url       = {https://mlanthology.org/ijcai/2022/zhang2022ijcai-few-a/}
}