Partner-Assisted Learning for Few-Shot Image Classification
Abstract
Few-shot learning has been studied to mimic human visual capabilities and to learn effective models without the need for exhaustive human annotation. Although the idea of meta-learning for adaptation has dominated few-shot learning methods, how to train a feature extractor remains a challenge. In this paper, we focus on the design of a training strategy that yields an elemental representation such that the prototype of each novel class can be estimated from a few labeled samples. We propose a two-stage training scheme, Partner-Assisted Learning (PAL), which first trains a partner encoder to model pair-wise similarities and extract features serving as soft anchors, and then trains a main encoder by aligning its outputs with the soft anchors while maximizing classification performance. Two alignment constraints, at the logit level and the feature level, are designed individually. For each few-shot task, we perform prototype classification. Our method consistently outperforms the state of the art on four benchmarks. Detailed ablation studies of PAL are provided to justify the selection of each component involved in training.
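The abstract outlines a two-stage recipe: a frozen partner encoder supplies soft anchors, a main encoder is trained with a classification loss plus logit-level and feature-level alignment, and few-shot episodes are solved by nearest-prototype classification. The PyTorch sketch below illustrates that structure under stated assumptions; the encoder architecture, the specific alignment losses (cosine feature alignment and temperature-scaled KL on logits), and the loss weights are illustrative choices, not the authors' released implementation.

```python
# Minimal sketch of the two-stage idea described in the abstract (assumptions noted inline).
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Stand-in backbone; the paper would use a deeper network (e.g., a ResNet)."""
    def __init__(self, feat_dim=64, num_classes=64):
        super().__init__()
        self.backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, feat_dim), nn.ReLU())
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        feat = self.backbone(x)
        return feat, self.classifier(feat)

def stage2_loss(main, partner, images, labels, t=4.0, w_feat=1.0, w_logit=1.0):
    """Classification loss plus feature- and logit-level alignment to the frozen
    partner's soft anchors (both alignment forms here are assumptions)."""
    feat_m, logit_m = main(images)
    with torch.no_grad():                      # partner stays frozen in stage 2
        feat_p, logit_p = partner(images)
    ce = F.cross_entropy(logit_m, labels)
    # Feature-level alignment: pull main features toward the partner's soft anchors.
    feat_align = 1.0 - F.cosine_similarity(feat_m, feat_p, dim=-1).mean()
    # Logit-level alignment: distillation-style KL on temperature-softened logits.
    logit_align = F.kl_div(F.log_softmax(logit_m / t, dim=-1),
                           F.softmax(logit_p / t, dim=-1),
                           reduction="batchmean") * t * t
    return ce + w_feat * feat_align + w_logit * logit_align

def prototype_classify(encoder, support_x, support_y, query_x, n_way):
    """Nearest-prototype classification for one few-shot episode."""
    with torch.no_grad():
        s_feat, _ = encoder(support_x)
        q_feat, _ = encoder(query_x)
    protos = torch.stack([s_feat[support_y == c].mean(0) for c in range(n_way)])
    dists = torch.cdist(q_feat, protos)        # Euclidean distance to each prototype
    return dists.argmin(dim=1)                 # predicted class index per query
```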
Cite
Text
Ma et al. "Partner-Assisted Learning for Few-Shot Image Classification." International Conference on Computer Vision, 2021. doi:10.1109/ICCV48922.2021.01040Markdown
[Ma et al. "Partner-Assisted Learning for Few-Shot Image Classification." International Conference on Computer Vision, 2021.](https://mlanthology.org/iccv/2021/ma2021iccv-partnerassisted/) doi:10.1109/ICCV48922.2021.01040BibTeX
@inproceedings{ma2021iccv-partnerassisted,
title = {{Partner-Assisted Learning for Few-Shot Image Classification}},
author = {Ma, Jiawei and Xie, Hanchen and Han, Guangxing and Chang, Shih-Fu and Galstyan, Aram and Abd-Almageed, Wael},
booktitle = {International Conference on Computer Vision},
year = {2021},
pages = {10573--10582},
doi = {10.1109/ICCV48922.2021.01040},
url = {https://mlanthology.org/iccv/2021/ma2021iccv-partnerassisted/}
}