Enhancing Few-Shot Image Classification with Unlabelled Examples
Abstract
We develop a transductive meta-learning method that uses unlabelled instances to improve few-shot image classification performance. Our approach combines a regularized Mahalanobis-distance-based soft k-means clustering procedure with a modified state-of-the-art neural adaptive feature extractor to achieve improved test-time classification accuracy using unlabelled data. We evaluate our method on transductive few-shot learning tasks, in which the goal is to jointly predict labels for query (test) examples given a set of support (training) examples. We achieve state-of-the-art performance on the Meta-Dataset, mini-ImageNet, and tiered-ImageNet benchmarks. All trained models and code have been made publicly available at github.com/plai-group/simple-cnaps.
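The abstract's core idea, soft k-means clustering with regularized Mahalanobis distances over support and query examples, can be sketched as follows. This is a hedged illustration of the general technique, not the authors' exact procedure: the function name, the identity-shrinkage regularizer, and the fixed iteration count are assumptions for this sketch, and the real method operates on features from an adapted neural extractor rather than raw inputs.

```python
import numpy as np

def mahalanobis_soft_kmeans(support, support_labels, query, n_iters=3, reg=1.0):
    """Transductive soft k-means with regularized Mahalanobis distances.

    Illustrative sketch only: class means and covariances are estimated from
    the labelled support set plus softly-assigned query examples, then query
    responsibilities are refined over a few iterations.
    """
    classes = np.unique(support_labels)
    d = support.shape[1]
    # Start with uniform soft responsibilities for each query point.
    resp = np.full((query.shape[0], classes.size), 1.0 / classes.size)
    for _ in range(n_iters):
        logits = np.zeros((query.shape[0], classes.size))
        for i, c in enumerate(classes):
            xs = support[support_labels == c]
            w = resp[:, i]
            # Mean over support (weight 1) and query (soft weights).
            total = xs.shape[0] + w.sum()
            mu = (xs.sum(axis=0) + (w[:, None] * query).sum(axis=0)) / total
            # Covariance from the same weighted pool, shrunk toward the
            # identity (an assumed stand-in for the paper's regularizer).
            diff_s = xs - mu
            diff_q = query - mu
            cov = (diff_s.T @ diff_s + (w[:, None] * diff_q).T @ diff_q) / total
            cov += reg * np.eye(d)
            inv = np.linalg.inv(cov)
            dq = query - mu
            # Negative half squared Mahalanobis distance as the class logit.
            logits[:, i] = -0.5 * np.einsum('nd,de,ne->n', dq, inv, dq)
        # Softmax over classes gives the new soft assignments.
        z = logits - logits.max(axis=1, keepdims=True)
        resp = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    return resp.argmax(axis=1)  # indices into `classes`
```

Because the query points contribute to the class statistics, well-separated unlabelled examples sharpen the estimated means and covariances, which is the transductive benefit the abstract describes.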
Cite
Text
Bateni et al. "Enhancing Few-Shot Image Classification with Unlabelled Examples." Winter Conference on Applications of Computer Vision, 2022.
Markdown
[Bateni et al. "Enhancing Few-Shot Image Classification with Unlabelled Examples." Winter Conference on Applications of Computer Vision, 2022.](https://mlanthology.org/wacv/2022/bateni2022wacv-enhancing/)
BibTeX
@inproceedings{bateni2022wacv-enhancing,
title = {{Enhancing Few-Shot Image Classification with Unlabelled Examples}},
author = {Bateni, Peyman and Barber, Jarred and van de Meent, Jan-Willem and Wood, Frank},
booktitle = {Winter Conference on Applications of Computer Vision},
year = {2022},
pages = {2796--2805},
url = {https://mlanthology.org/wacv/2022/bateni2022wacv-enhancing/}
}