Divide and Conquer: Two-Level Problem Remodeling for Large-Scale Few-Shot Learning
Abstract
Few-shot learning methods have achieved notable performance in recent years. However, few-shot learning in large-scale settings with hundreds of classes remains challenging. In this paper, we tackle the problem of large-scale few-shot learning by taking advantage of pre-trained foundation models. We recast the original problem at two levels of granularity. At the coarse-grained level, we introduce a novel object recognition approach that is robust to sub-population shifts. At the fine-grained level, generative experts, each specialized for a different superclass, are designed for few-shot learning. A Bayesian schema combines the coarse-grained information with the fine-grained predictions in a winner-takes-all fashion. Extensive experiments on large-scale datasets and different architectures show that the proposed method is both effective and efficient, in addition to being simple and a natural remodeling of the problem. The code is publicly available at https://github.com/mohamadreza99/divide_and_conquer.
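As a rough sketch of the winner-takes-all combination the abstract describes, the snippet below is our own illustration rather than the authors' released code; the probability arrays and the per-superclass expert outputs are hypothetical placeholders. It simply picks the most probable superclass at the coarse level and then defers to the expert specialized for that superclass at the fine level.

```python
import numpy as np

def winner_takes_all(superclass_probs, expert_probs_per_superclass):
    """Combine coarse and fine predictions in a winner-takes-all fashion:
    the most probable superclass alone decides which expert's
    fine-grained prediction becomes the final label."""
    winner = int(np.argmax(superclass_probs))          # coarse-level decision
    fine_probs = expert_probs_per_superclass[winner]   # expert for the winning superclass
    fine_label = int(np.argmax(fine_probs))            # fine-level decision within that superclass
    return winner, fine_label

# Toy example: 3 superclasses, each expert covering 4 fine-grained classes.
coarse = np.array([0.1, 0.7, 0.2])
experts = [np.random.dirichlet(np.ones(4)) for _ in range(3)]
print(winner_takes_all(coarse, experts))  # e.g. (1, 2): superclass 1, fine class 2
```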
Cite
Text
Fereydooni et al. "Divide and Conquer: Two-Level Problem Remodeling for Large-Scale Few-Shot Learning." NeurIPS 2023 Workshops: R0-FoMo, 2023.
Markdown
[Fereydooni et al. "Divide and Conquer: Two-Level Problem Remodeling for Large-Scale Few-Shot Learning." NeurIPS 2023 Workshops: R0-FoMo, 2023.](https://mlanthology.org/neuripsw/2023/fereydooni2023neuripsw-divide/)
BibTeX
@inproceedings{fereydooni2023neuripsw-divide,
title = {{Divide and Conquer: Two-Level Problem Remodeling for Large-Scale Few-Shot Learning}},
author = {Fereydooni, Mohamadreza and Hasani, Hosein and Razghandi, Ali and Baghshah, Mahdieh Soleymani},
booktitle = {NeurIPS 2023 Workshops: R0-FoMo},
year = {2023},
url = {https://mlanthology.org/neuripsw/2023/fereydooni2023neuripsw-divide/}
}