Understanding Cross-Domain Few-Shot Learning Based on Domain Similarity and Few-Shot Difficulty

Abstract

Cross-domain few-shot learning (CD-FSL) has drawn increasing attention for handling large differences between the source and target domains, an important concern in real-world scenarios. To overcome these large differences, recent works have considered exploiting small-scale unlabeled data from the target domain during the pre-training stage. This data enables self-supervised pre-training on the target domain, in addition to supervised pre-training on the source domain. In this paper, we empirically investigate which pre-training is preferred based on domain similarity and few-shot difficulty of the target domain. We discover that the performance gain of self-supervised pre-training over supervised pre-training becomes large when the target domain is dissimilar to the source domain, or the target domain itself has low few-shot difficulty. We further design two pre-training schemes, mixed-supervised and two-stage learning, that improve performance. In this light, we present six findings for CD-FSL, which are supported by extensive experiments and analyses on three source and eight target benchmark datasets with varying levels of domain similarity and few-shot difficulty. Our code is available at https://github.com/sungnyun/understanding-cdfsl.
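The abstract mentions a mixed-supervised pre-training scheme that combines supervised learning on the labeled source domain with self-supervised learning on small-scale unlabeled target data. Below is a minimal PyTorch sketch of that idea, not the authors' implementation: the SimCLR-style contrastive loss, the projection head, and the balancing weight `gamma` are illustrative assumptions; see the official repository linked above for the actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def nt_xent_loss(z1, z2, temperature=0.5):
    """SimCLR-style contrastive loss over two augmented views (assumed SSL objective)."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)       # (2B, D)
    sim = z @ z.t() / temperature                            # pairwise cosine similarities
    n = z.size(0)
    sim.fill_diagonal_(float('-inf'))                        # exclude self-similarity
    targets = torch.arange(n, device=z.device).roll(n // 2)  # positive = the other view
    return F.cross_entropy(sim, targets)


class MixedSupervisedPretrainer(nn.Module):
    """Sketch of mixed-supervised pre-training: (1 - gamma) * SL + gamma * SSL."""

    def __init__(self, encoder, feat_dim, num_source_classes, gamma=0.5):
        super().__init__()
        self.encoder = encoder                                # shared backbone
        self.cls_head = nn.Linear(feat_dim, num_source_classes)
        self.proj_head = nn.Sequential(                       # projection head for the SSL branch
            nn.Linear(feat_dim, feat_dim), nn.ReLU(), nn.Linear(feat_dim, 128))
        self.gamma = gamma                                    # SSL weight (hypothetical value)

    def forward(self, x_src, y_src, x_tgt_v1, x_tgt_v2):
        # supervised cross-entropy on a labeled source-domain batch
        loss_sl = F.cross_entropy(self.cls_head(self.encoder(x_src)), y_src)
        # self-supervised loss on two augmented views of an unlabeled target-domain batch
        z1 = self.proj_head(self.encoder(x_tgt_v1))
        z2 = self.proj_head(self.encoder(x_tgt_v2))
        loss_ssl = nt_xent_loss(z1, z2)
        return (1 - self.gamma) * loss_sl + self.gamma * loss_ssl
```

Under the same assumptions, a two-stage variant would first complete supervised pre-training on the source data and then continue training the encoder with only the self-supervised loss on the unlabeled target data.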

Cite

Text

Oh et al. "Understanding Cross-Domain Few-Shot Learning Based on Domain Similarity and Few-Shot Difficulty." Neural Information Processing Systems, 2022.

Markdown

[Oh et al. "Understanding Cross-Domain Few-Shot Learning Based on Domain Similarity and Few-Shot Difficulty." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/oh2022neurips-understanding/)

BibTeX

@inproceedings{oh2022neurips-understanding,
  title     = {{Understanding Cross-Domain Few-Shot Learning Based on Domain Similarity and Few-Shot Difficulty}},
  author    = {Oh, Jaehoon and Kim, Sungnyun and Ho, Namgyu and Kim, Jin-Hwa and Song, Hwanjun and Yun, Se-Young},
  booktitle = {Neural Information Processing Systems},
  year      = {2022},
  url       = {https://mlanthology.org/neurips/2022/oh2022neurips-understanding/}
}