AWE: Adaptive Weight-Space Ensembling for Few-Shot Fine-Tuning
Abstract
In this paper, we introduce a new transfer learning approach called Adaptive Weight-space Ensembling (AWE) that effectively adapts large pre-trained models to downstream tasks with limited fine-tuning data. Traditional transfer learning methods often struggle or become infeasible when only a few examples per class are available, particularly when a validation set is required. AWE overcomes these challenges by adapting the weight-space ensembling technique, originally developed for large-scale data, to few-shot settings without requiring a validation set. By identifying patterns in oracle weight-space ensembling, we derive an adaptive ensembling method that can be easily implemented in real-world applications. Our approach outperforms existing state-of-the-art methods by more than 2% on average on standard few-shot benchmarks.
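The weight-space ensembling that AWE builds on combines a pre-trained and a fine-tuned model by interpolating their parameters directly, rather than averaging their predictions. A minimal sketch of that underlying idea is below; the function name and toy weights are illustrative, and AWE's adaptive choice of the mixing coefficient is not shown:

```python
def weight_space_ensemble(theta_pre, theta_ft, alpha):
    """Parameter-wise interpolation of two weight dictionaries.

    alpha = 0 recovers the pre-trained model, alpha = 1 the
    fully fine-tuned one; intermediate values blend the two.
    """
    return {
        name: [(1 - alpha) * p + alpha * f
               for p, f in zip(theta_pre[name], theta_ft[name])]
        for name in theta_pre
    }

# Toy example with two flattened "layers".
theta_pre = {"layer.w": [0.0, 0.0], "layer.b": [1.0]}
theta_ft  = {"layer.w": [2.0, 4.0], "layer.b": [3.0]}

theta_mix = weight_space_ensemble(theta_pre, theta_ft, alpha=0.5)
# theta_mix["layer.w"] -> [1.0, 2.0]
```

Because the interpolation happens in weight space, only a single model is kept at inference time, unlike output-space ensembles that require a forward pass per member.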
Cite
Text
Gagnon-Audet et al. "AWE: Adaptive Weight-Space Ensembling for Few-Shot Fine-Tuning." ICLR 2023 Workshops: ME-FoMo, 2023.
Markdown
[Gagnon-Audet et al. "AWE: Adaptive Weight-Space Ensembling for Few-Shot Fine-Tuning." ICLR 2023 Workshops: ME-FoMo, 2023.](https://mlanthology.org/iclrw/2023/gagnonaudet2023iclrw-awe/)
BibTeX
@inproceedings{gagnonaudet2023iclrw-awe,
  title = {{AWE: Adaptive Weight-Space Ensembling for Few-Shot Fine-Tuning}},
  author = {Gagnon-Audet, Jean-Christophe and Monti, Ricardo Pio and Schwab, David J.},
  booktitle = {ICLR 2023 Workshops: ME-FoMo},
  year = {2023},
  url = {https://mlanthology.org/iclrw/2023/gagnonaudet2023iclrw-awe/}
}