Self-Attention Message Passing for Contrastive Few-Shot Learning
Abstract
Humans have a unique ability to learn new representations from just a handful of examples with little to no supervision. Deep learning models, however, require an abundance of data and supervision to perform at a satisfactory level. Unsupervised few-shot learning (U-FSL) is the pursuit of bridging this gap between machines and humans. Inspired by the capacity of graph neural networks (GNNs) to discover complex inter-sample relationships, we propose a novel self-attention based message passing contrastive learning approach (coined SAMP-CLR) for U-FSL pre-training. We also propose an optimal transport (OT) based fine-tuning strategy (called OpT-Tune) to efficiently induce task awareness into our novel end-to-end unsupervised few-shot classification framework (SAMPTransfer). Our extensive experimental results corroborate the efficacy of SAMPTransfer in a variety of downstream few-shot classification scenarios, setting a new state of the art for U-FSL on both the miniImageNet and tieredImageNet benchmarks, with improvements of up to 7% and 5%, respectively. Our further investigations also confirm that SAMPTransfer remains on par with some supervised baselines on miniImageNet and outperforms all existing U-FSL baselines in a challenging cross-domain scenario.
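The core idea named in the abstract, self-attention message passing over a graph of sample embeddings, can be illustrated with a minimal sketch. This is a hypothetical, simplified single layer (the function name `samp_layer` and the weight shapes are assumptions for illustration, not the paper's exact architecture): each sample attends to every other sample in the batch, and messages are the attention-weighted value vectors aggregated with a residual connection.

```python
import numpy as np


def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)


def samp_layer(h, Wq, Wk, Wv):
    """One self-attention message-passing step over a fully connected
    graph of sample embeddings h, shaped (n_samples, dim).

    Illustrative sketch only: a real implementation would add multiple
    heads, layer normalization, and learned output projections.
    """
    q, k, v = h @ Wq, h @ Wk, h @ Wv
    # Pairwise attention: each row i weights the messages sent to sample i.
    attn = softmax(q @ k.T / np.sqrt(k.shape[1]))
    # Aggregate messages and add a residual connection.
    return h + attn @ v


rng = np.random.default_rng(0)
n, d = 5, 8                      # 5 samples, 8-dim embeddings
h = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
out = samp_layer(h, Wq, Wk, Wv)
print(out.shape)  # (5, 8): refined embeddings, same shape as the input
```

Because the attention matrix is row-stochastic, each refined embedding is the original embedding plus a convex combination of the batch's value vectors, which is how inter-sample relationships enter the representation.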
Cite
Text
Shirekar et al. "Self-Attention Message Passing for Contrastive Few-Shot Learning." Winter Conference on Applications of Computer Vision, 2023.
Markdown
[Shirekar et al. "Self-Attention Message Passing for Contrastive Few-Shot Learning." Winter Conference on Applications of Computer Vision, 2023.](https://mlanthology.org/wacv/2023/shirekar2023wacv-selfattention/)
BibTeX
@inproceedings{shirekar2023wacv-selfattention,
title = {{Self-Attention Message Passing for Contrastive Few-Shot Learning}},
author = {Shirekar, Ojas Kishorkumar and Singh, Anuj and Jamali-Rad, Hadi},
booktitle = {Winter Conference on Applications of Computer Vision},
year = {2023},
pages = {5426--5436},
url = {https://mlanthology.org/wacv/2023/shirekar2023wacv-selfattention/}
}