Flatness Improves Backbone Generalisation in Few-Shot Classification

Abstract

Deployment of deep neural networks in real-world settings typically requires adaptation to new tasks with few examples. Few-shot classification (FSC) addresses this problem by leveraging pre-trained backbones for fast adaptation to new classes. However, approaches for multi-domain FSC typically result in complex pipelines aimed at information fusion and task-specific adaptation, without considering the importance of backbone training. In this work, we introduce an effective strategy for backbone training and selection in multi-domain FSC by utilizing flatness-aware training and fine-tuning. Our approach is theoretically grounded and, despite being simpler, empirically performs on par with or better than state-of-the-art methods. Furthermore, our results indicate that backbone training is crucial for good generalisation in FSC across different adaptation methods.
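
The abstract only names "flatness-aware training and fine-tuning" at a high level. A common instance of flatness-aware optimisation is sharpness-aware minimisation (SAM; Foret et al., 2021); the PyTorch sketch below illustrates that general idea and is not the authors' implementation. The names `backbone`, `loss_fn`, `base_opt`, and the radius `rho` are placeholders, and the hyperparameters are not taken from the paper.

import torch

def sam_step(backbone, loss_fn, x, y, base_opt, rho=0.05):
    # First forward/backward pass: gradients at the current weights w.
    loss = loss_fn(backbone(x), y)
    loss.backward()

    # Ascent step: move to the approximate worst-case point w + e(w)
    # inside an L2 ball of radius rho around w.
    grad_norm = torch.norm(torch.stack(
        [p.grad.norm(2) for p in backbone.parameters() if p.grad is not None]))
    perturbations = []
    with torch.no_grad():
        for p in backbone.parameters():
            if p.grad is None:
                perturbations.append(None)
                continue
            e = rho * p.grad / (grad_norm + 1e-12)
            p.add_(e)                      # perturb the weights in place
            perturbations.append(e)
    base_opt.zero_grad()

    # Second forward/backward pass: gradients at the perturbed weights.
    loss_fn(backbone(x), y).backward()

    # Undo the perturbation, then update w with the sharpness-aware gradient.
    with torch.no_grad():
        for p, e in zip(backbone.parameters(), perturbations):
            if e is not None:
                p.sub_(e)
    base_opt.step()
    base_opt.zero_grad()
    return loss.item()

In a backbone-training loop, a step like this would replace the usual loss.backward() / optimizer.step() pair for each mini-batch, at the cost of a second forward and backward pass.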

Cite

Text

Li et al. "Flatness Improves Backbone Generalisation in Few-Shot Classification." Winter Conference on Applications of Computer Vision, 2025.

Markdown

[Li et al. "Flatness Improves Backbone Generalisation in Few-Shot Classification." Winter Conference on Applications of Computer Vision, 2025.](https://mlanthology.org/wacv/2025/li2025wacv-flatness/)

BibTeX

@inproceedings{li2025wacv-flatness,
  title     = {{Flatness Improves Backbone Generalisation in Few-Shot Classification}},
  author    = {Li, Rui and Trapp, Martin and Klasson, Marcus and Solin, Arno},
  booktitle = {Winter Conference on Applications of Computer Vision},
  year      = {2025},
  pages     = {1072--1089},
  url       = {https://mlanthology.org/wacv/2025/li2025wacv-flatness/}
}