Meta Evidential Transformer for Few-Shot Open-Set Recognition
Abstract
Few-shot open-set recognition (FSOSR) aims to detect instances from unseen classes while utilizing only a small set of labeled instances from the closed-set classes. Accurately rejecting instances from open-set classes in the few-shot setting is fundamentally more challenging because fewer labels provide a weaker supervision signal. Transformer-based few-shot methods exploit attention maps to achieve consistent representations. However, the softmax-generated attention map normalizes over all instances, assigning unnecessarily high attention weights to instances that are not close to the closed-set classes, which degrades detection performance. In addition, open-set samples that are similar to a particular closed-set class pose a significant challenge to most existing FSOSR models. To address these challenges, we propose a novel Meta Evidential Transformer (MET)-based FSOSR model that uses an evidential open-set loss to learn more compact closed-set class representations by effectively leveraging similar closed-set classes. MET further integrates an evidence-to-variance ratio to detect fundamentally challenging tasks and uses an evidence-guided cross-attention mechanism to better separate difficult open-set samples. Experiments on real-world datasets demonstrate consistent improvements over existing competitive methods in unseen-class recognition without deteriorating closed-set performance.
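The abstract names two mechanisms: Dirichlet-style evidence over the closed-set classes (with an evidence-to-variance ratio to flag hard tasks) and an evidence-guided cross-attention that avoids giving far-from-closed-set queries a full softmax attention budget. The following is a minimal illustrative sketch of these ideas in PyTorch, not the paper's implementation; the helper names (evidential_head, evidence_guided_attention), the exact gating rule, and the ratio construction are assumptions.

# Minimal, illustrative sketch (not the authors' released code) of the two ideas
# named in the abstract. Helper names and the exact gating rule are assumptions.
import torch
import torch.nn.functional as F

def evidential_head(query_emb, prototypes):
    # query_emb: [Q, D] query embeddings; prototypes: [K, D] closed-set class prototypes.
    # Similarities are mapped to non-negative Dirichlet evidence (standard evidential
    # deep learning: alpha_k = e_k + 1, vacuity u = K / sum_k alpha_k).
    logits = query_emb @ prototypes.t()                     # [Q, K] similarity scores
    evidence = F.softplus(logits)                           # non-negative evidence e_k
    alpha = evidence + 1.0                                  # Dirichlet parameters
    strength = alpha.sum(dim=-1, keepdim=True)              # total Dirichlet strength S
    vacuity = prototypes.size(0) / strength                 # high vacuity -> likely open-set
    # Per-class variance of the Dirichlet mean; comparing accumulated evidence against this
    # variance could flag hard tasks (the paper's exact evidence-to-variance ratio may differ).
    variance = alpha * (strength - alpha) / (strength ** 2 * (strength + 1.0))
    return evidence, vacuity, variance

def evidence_guided_attention(query_emb, support_emb, vacuity):
    # Plain softmax gives every query a full attention budget over the support set; here each
    # query's attended output is scaled by (1 - vacuity), so queries far from all closed-set
    # classes contribute weaker, rather than unnecessarily strong, attention.
    scores = query_emb @ support_emb.t() / query_emb.size(-1) ** 0.5
    attn = torch.softmax(scores, dim=-1)                    # [Q, N] attention over support
    return (1.0 - vacuity) * (attn @ support_emb)           # [Q, D] evidence-gated output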
Cite
Text
Sapkota et al. "Meta Evidential Transformer for Few-Shot Open-Set Recognition." International Conference on Machine Learning, 2024.
BibTeX
@inproceedings{sapkota2024icml-meta,
title = {{Meta Evidential Transformer for Few-Shot Open-Set Recognition}},
author = {Sapkota, Hitesh and Neupane, Krishna Prasad and Yu, Qi},
booktitle = {International Conference on Machine Learning},
year = {2024},
pages = {43389--43406},
volume = {235},
url = {https://mlanthology.org/icml/2024/sapkota2024icml-meta/}
}