Self-Trained Centroid Classifiers for Semi-Supervised Cross-Domain Few-Shot Learning
Abstract
State-of-the-art cross-domain few-shot learning methods for image classification apply knowledge transfer by fine-tuning deep feature extractors obtained from source domains on the small labelled dataset available for the target domain, generally in conjunction with a simple centroid-based classification head. Semi-supervised learning during the meta-test phase is an obvious approach to incorporating unlabelled data into cross-domain few-shot learning. However, semi-supervised methods designed for larger labelled sets than those available in few-shot learning can easily go astray when applied in this setting. We propose an efficient semi-supervised learning method that applies self-training to the classification head only, and show that it yields consistent improvements in average performance on the Meta-Dataset benchmark for cross-domain few-shot learning when applied with contemporary methods utilising centroid-based classification.
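The core idea (self-training restricted to a centroid-based classification head over fixed embeddings) can be illustrated with a minimal sketch. This is an assumption-laden simplification, not the authors' exact procedure: it takes embeddings from a frozen feature extractor, initialises class centroids from the labelled support set, pseudo-labels the unlabelled examples by nearest centroid, and recomputes the centroids from the union of labelled and pseudo-labelled data.

```python
import numpy as np

def nearest_centroid_predict(X, centroids):
    """Assign each row of X to the index of the nearest centroid (Euclidean)."""
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

def self_train_centroids(X_lab, y_lab, X_unlab, n_iters=3):
    """Illustrative self-training loop on the classification head only.

    X_lab, X_unlab: embeddings from a (hypothetical) frozen feature extractor.
    The extractor itself is never updated here, only the centroids.
    """
    classes = np.unique(y_lab)
    # Initial centroids: per-class means of the labelled support embeddings.
    centroids = np.stack([X_lab[y_lab == c].mean(axis=0) for c in classes])
    for _ in range(n_iters):
        # Pseudo-label the unlabelled examples with the current centroids.
        pseudo = classes[nearest_centroid_predict(X_unlab, centroids)]
        # Recompute centroids over labelled + pseudo-labelled data.
        X_all = np.vstack([X_lab, X_unlab])
        y_all = np.concatenate([y_lab, pseudo])
        centroids = np.stack([X_all[y_all == c].mean(axis=0) for c in classes])
    return centroids
```

Because only the centroids are refit, each self-training iteration is cheap, which matches the paper's emphasis on efficiency; the full method's safeguards (e.g. how pseudo-labels are weighted or filtered) are not reproduced here.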
Cite
Text
Wang et al. "Self-Trained Centroid Classifiers for Semi-Supervised Cross-Domain Few-Shot Learning." Proceedings of The 2nd Conference on Lifelong Learning Agents, 2023.
Markdown
[Wang et al. "Self-Trained Centroid Classifiers for Semi-Supervised Cross-Domain Few-Shot Learning." Proceedings of The 2nd Conference on Lifelong Learning Agents, 2023.](https://mlanthology.org/collas/2023/wang2023collas-selftrained/)
BibTeX
@inproceedings{wang2023collas-selftrained,
title = {{Self-Trained Centroid Classifiers for Semi-Supervised Cross-Domain Few-Shot Learning}},
author = {Wang, Hongyu and Frank, Eibe and Pfahringer, Bernhard and Holmes, Geoffrey},
booktitle = {Proceedings of The 2nd Conference on Lifelong Learning Agents},
year = {2023},
pages = {481--492},
volume = {232},
url = {https://mlanthology.org/collas/2023/wang2023collas-selftrained/}
}