Pruning Feature Extractor Stacking for Cross-Domain Few-Shot Learning

Abstract

Combining knowledge from source domains to learn efficiently from a few labelled instances in a target domain is a transfer learning problem known as cross-domain few-shot learning (CDFSL). Feature extractor stacking (FES) is a state-of-the-art CDFSL method that maintains a collection of source-domain feature extractors instead of a single universal extractor. FES uses stacked generalisation to build an ensemble from extractor snapshots saved during target-domain fine-tuning, and it outperforms several contemporary universal-model-based CDFSL methods on the Meta-Dataset benchmark. However, it incurs a higher storage cost than single-model approaches because it saves a snapshot of every extractor at every fine-tuning iteration. In this work, we propose a bidirectional snapshot selection strategy for FES that leverages its cross-validation process and the ordered nature of its snapshots, and we demonstrate that the number of stored snapshots can be reduced by 95% while retaining the same level of accuracy.
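This entry does not spell out the selection procedure, but the abstract's description (cross-validation scores over an ordered snapshot sequence, pruned from both ends) admits a minimal sketch like the one below. The helper cv_accuracy, the fixed full-ensemble baseline, and the tolerance parameter are assumptions made for illustration, not the authors' exact method.

import numpy as np

def bidirectional_prune(cv_accuracy, n_snapshots, tolerance=0.0):
    # Snapshots are ordered by fine-tuning iteration; keep a contiguous
    # range [lo, hi] and shrink it from whichever end hurts accuracy less.
    # cv_accuracy maps a list of snapshot indices to the cross-validation
    # accuracy of the ensemble built from them (hypothetical helper,
    # assumed to reuse FES's existing cross-validation predictions).
    lo, hi = 0, n_snapshots - 1
    baseline = cv_accuracy(list(range(lo, hi + 1)))  # full-ensemble accuracy
    while lo < hi:
        drop_front = cv_accuracy(list(range(lo + 1, hi + 1)))
        drop_back = cv_accuracy(list(range(lo, hi)))
        if max(drop_front, drop_back) < baseline - tolerance:
            break  # any further pruning would fall below the full ensemble
        if drop_front >= drop_back:
            lo += 1  # the earliest remaining snapshot contributes least
        else:
            hi -= 1  # the latest remaining snapshot contributes least
    return lo, hi

# Toy demonstration: simulated snapshot quality peaks mid-training.
quality = np.exp(-0.5 * ((np.arange(100) - 60) / 10.0) ** 2)

def toy_cv_accuracy(indices):
    return float(np.mean(quality[indices]))

lo, hi = bidirectional_prune(toy_cv_accuracy, 100)
print(f"kept snapshots {lo}..{hi} of 100")

In this toy run the window collapses to the peak-quality snapshot, discarding the rest without dropping below the full-ensemble score. With a real stacked ensemble, the accuracy of a subset is not a simple mean of per-snapshot quality, so the retained window would typically be wider; the sketch only illustrates the two-pointer search over the ordered snapshots.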

Cite

Text

Wang et al. "Pruning Feature Extractor Stacking for Cross-Domain Few-Shot Learning." Transactions on Machine Learning Research, 2025.

Markdown

[Wang et al. "Pruning Feature Extractor Stacking for Cross-Domain Few-Shot Learning." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/wang2025tmlr-pruning/)

BibTeX

@article{wang2025tmlr-pruning,
  title     = {{Pruning Feature Extractor Stacking for Cross-Domain Few-Shot Learning}},
  author    = {Wang, Hongyu and Frank, Eibe and Pfahringer, Bernhard and Holmes, Geoff},
  journal   = {Transactions on Machine Learning Research},
  year      = {2025},
  url       = {https://mlanthology.org/tmlr/2025/wang2025tmlr-pruning/}
}