Relative Position and Map Networks in Few-Shot Learning for Image Classification
Abstract
Few-shot learning is an important research topic in image classification, aiming to train robust classifiers that can categorize images from new classes for which only a few labeled samples are available. Recently, metric-learning-based methods have achieved promising performance by learning a distance metric to directly compare query images against training samples. In this work, we exploit finer-grained information from image feature maps and propose a new approach. Specifically, we develop a Relative Position Network (RPN) based on the attention mechanism to compare pairs of activation cells between each query image and each training image, capturing their intrinsic correspondences. Moreover, we introduce a Relative Map Network (RMN) to learn a distance metric over the attention maps obtained from the RPN, which better measures the similarity between query and training images. Extensive experiments demonstrate the effectiveness of the proposed method. Our code will be released at https://github.com/chrisyxue/RMN-RPN-for-FSL.
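The paper itself is not reproduced on this page, so the following PyTorch sketch only illustrates the mechanism the abstract describes: an RPN that attends over pairs of activation cells from the query and support (training) feature maps, and an RMN that scores the resulting attention map as a similarity. All module names, layer choices, and dimensions (RelativePositionNetwork, RelativeMapNetwork, feat_dim, the small conv scorer) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the RPN/RMN idea from the abstract; NOT the authors'
# implementation. All module names, layers, and sizes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelativePositionNetwork(nn.Module):
    """Attention over pairs of activation cells: every cell of the query
    feature map is compared against every cell of the support map."""
    def __init__(self, feat_dim: int = 64):
        super().__init__()
        self.query_proj = nn.Linear(feat_dim, feat_dim)
        self.support_proj = nn.Linear(feat_dim, feat_dim)

    def forward(self, q_feat: torch.Tensor, s_feat: torch.Tensor) -> torch.Tensor:
        # q_feat, s_feat: (B, C, H, W) feature maps from a shared backbone.
        B, C, H, W = q_feat.shape
        q = self.query_proj(q_feat.flatten(2).transpose(1, 2))    # (B, HW, C)
        s = self.support_proj(s_feat.flatten(2).transpose(1, 2))  # (B, HW, C)
        # Cell-to-cell affinities, normalized over the support cells.
        attn = torch.einsum("bqc,bsc->bqs", q, s) / C ** 0.5      # (B, HW, HW)
        return F.softmax(attn, dim=-1)

class RelativeMapNetwork(nn.Module):
    """Scores an RPN attention map as a scalar query/support similarity."""
    def __init__(self):
        super().__init__()
        self.score = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(8, 1),
        )

    def forward(self, attn: torch.Tensor) -> torch.Tensor:
        # attn: (B, HW, HW) attention map, treated as a one-channel image.
        return self.score(attn.unsqueeze(1)).squeeze(-1)  # (B, 1) -> (B,)

# Usage: score a query against one support image per class and classify
# the query by the highest-scoring class.
rpn, rmn = RelativePositionNetwork(feat_dim=64), RelativeMapNetwork()
q = torch.randn(1, 64, 5, 5)  # query feature map (e.g. from a Conv-4 backbone)
s = torch.randn(1, 64, 5, 5)  # support (training) feature map
similarity = rmn(rpn(q, s))   # shape (1,)
```

Under these assumptions, few-shot classification reduces to computing one similarity score per support class and taking the argmax; the dot-product attention and conv scorer stand in for whatever attention and metric modules the paper actually uses.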
Cite
Text
Xue et al. "Relative Position and Map Networks in Few-Shot Learning for Image Classification." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2020. doi:10.1109/CVPRW50498.2020.00474

Markdown
[Xue et al. "Relative Position and Map Networks in Few-Shot Learning for Image Classification." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2020.](https://mlanthology.org/cvprw/2020/xue2020cvprw-relative/) doi:10.1109/CVPRW50498.2020.00474

BibTeX
@inproceedings{xue2020cvprw-relative,
title = {{Relative Position and Map Networks in Few-Shot Learning for Image Classification}},
author = {Xue, Zhiyu and Xie, Zhenshan and Xing, Zheng and Duan, Lixin},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2020},
pages = {4032-4036},
doi = {10.1109/CVPRW50498.2020.00474},
url = {https://mlanthology.org/cvprw/2020/xue2020cvprw-relative/}
}