A Dual Attention Network with Semantic Embedding for Few-Shot Learning
Abstract
Despite the recent success of deep neural networks, it remains challenging to efficiently learn new visual concepts from limited training data. To address this problem, a prevailing strategy is to build a meta-learner that learns prior knowledge on learning from a small set of annotated data. However, most existing meta-learning approaches rely on a global representation of images and a meta-learner with a complex model structure, which are sensitive to background clutter and difficult to interpret. We propose a novel meta-learning method for few-shot classification based on two simple attention mechanisms: a spatial attention that localizes relevant object regions, and a task attention that selects similar training data for label prediction. We implement our method via a dual-attention network and design a semantic-aware meta-learning loss to train the meta-learner network in an end-to-end manner. We validate our model on three few-shot image classification datasets with an extensive ablation study, and our approach shows competitive performance on these datasets with fewer parameters. To facilitate future research, the code and data splits are available at: https://github.com/tonysy/STANet-PyTorch
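The two attention mechanisms named in the abstract can be illustrated with a short sketch. The snippet below is a minimal, illustrative PyTorch interpretation (the official repository is PyTorch-based, per the STANet-PyTorch URL); the module names, the 1x1-conv spatial scorer, and the cosine-based task attention are assumptions chosen for exposition, not the authors' exact architecture.

```python
# Minimal sketch of the dual-attention idea, assuming a conv backbone that
# produces (B, C, H, W) feature maps. All names and formulations here are
# illustrative, not the STANet implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualAttentionSketch(nn.Module):
    def __init__(self, feat_dim=64):
        super().__init__()
        # Spatial attention: score each location of the feature map so that
        # background clutter is down-weighted before pooling.
        self.spatial_scorer = nn.Conv2d(feat_dim, 1, kernel_size=1)

    def _spatial_attend(self, feats):
        # feats: (B, C, H, W) -> attended embedding (B, C)
        scores = self.spatial_scorer(feats)             # (B, 1, H, W)
        weights = F.softmax(scores.flatten(2), dim=-1)  # softmax over H*W
        return (feats.flatten(2) * weights).sum(-1)     # weighted pooling

    def forward(self, support_feats, query_feats):
        # support_feats: (N, C, H, W) features of the few labeled images
        # query_feats:   (Q, C, H, W) features of the query images
        support_emb = self._spatial_attend(support_feats)  # (N, C)
        query_emb = self._spatial_attend(query_feats)      # (Q, C)

        # Task attention: weight each support example by its similarity
        # to the query, so prediction relies on the most relevant shots.
        sim = F.cosine_similarity(
            query_emb.unsqueeze(1), support_emb.unsqueeze(0), dim=-1
        )                                                  # (Q, N)
        task_weights = F.softmax(sim, dim=-1)              # (Q, N)
        return query_emb, task_weights

# Usage: given one-hot support labels (N, num_classes), a query's class
# distribution is a task-attention-weighted vote over the support set:
#   _, task_weights = model(support_feats, query_feats)
#   probs = task_weights @ support_onehot   # (Q, num_classes)
```

In this reading, the spatial attention replaces global average pooling with a learned location weighting, and the task attention plays the role of a soft nearest-neighbor classifier over the support set.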
Cite
Text
Yan et al. "A Dual Attention Network with Semantic Embedding for Few-Shot Learning." AAAI Conference on Artificial Intelligence, 2019. doi:10.1609/AAAI.V33I01.33019079
Markdown
[Yan et al. "A Dual Attention Network with Semantic Embedding for Few-Shot Learning." AAAI Conference on Artificial Intelligence, 2019.](https://mlanthology.org/aaai/2019/yan2019aaai-dual/) doi:10.1609/AAAI.V33I01.33019079
BibTeX
@inproceedings{yan2019aaai-dual,
title = {{A Dual Attention Network with Semantic Embedding for Few-Shot Learning}},
author = {Yan, Shipeng and Zhang, Songyang and He, Xuming},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2019},
pages = {9079--9086},
doi = {10.1609/AAAI.V33I01.33019079},
url = {https://mlanthology.org/aaai/2019/yan2019aaai-dual/}
}