Breadcrumbs: Adversarial Class-Balanced Sampling for Long-Tailed Recognition
Abstract
The problem of long-tailed recognition, where the number of examples per class is highly imbalanced, is considered. While training with class-balanced sampling has been shown effective for this problem, it is known to over-fit to few-shot classes. It is hypothesized that this is due to the repeated sampling of examples and can be addressed by feature space augmentation. A new feature augmentation strategy, EMANATE, based on back-tracking of features across epochs during training, is proposed. It is shown that, unlike class-balanced sampling, this is an adversarial augmentation strategy. A new sampling procedure, Breadcrumb, is then introduced to implement adversarial class-balanced sampling without extra computation. Experiments on three popular long-tailed recognition datasets show that Breadcrumb training produces classifiers that outperform existing solutions to the problem.
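The class-balanced sampling the abstract refers to can be sketched as follows. This is an illustrative Python snippet, not the paper's implementation: function and variable names are our own, and the repeated draws from few-shot classes illustrate the over-fitting pressure the paper attributes to this scheme.

```python
import random
from collections import defaultdict

def class_balanced_batch(labels, batch_size, rng=None):
    """Draw a batch by first sampling a class uniformly, then sampling an
    example within that class with replacement. Tail classes with few
    examples are therefore re-sampled far more often than under
    instance-balanced sampling."""
    rng = rng or random.Random(0)
    by_class = defaultdict(list)
    for idx, y in enumerate(labels):
        by_class[y].append(idx)
    classes = list(by_class)
    return [rng.choice(by_class[rng.choice(classes)]) for _ in range(batch_size)]

# Toy long-tailed labels: class 0 is a head class (6 examples),
# class 1 is a few-shot tail class (1 example).
labels = [0, 0, 0, 0, 0, 0, 1]
batch = class_balanced_batch(labels, batch_size=8)
```

Because each class is equally likely to be drawn, the single example of class 1 above is expected to appear in roughly half the batch slots, which is exactly the repeated sampling of few-shot examples that the proposed feature back-tracking is meant to counteract.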
Cite
Text
Liu et al. "Breadcrumbs: Adversarial Class-Balanced Sampling for Long-Tailed Recognition." Proceedings of the European Conference on Computer Vision (ECCV), 2022. doi:10.1007/978-3-031-20053-3_37

Markdown
[Liu et al. "Breadcrumbs: Adversarial Class-Balanced Sampling for Long-Tailed Recognition." Proceedings of the European Conference on Computer Vision (ECCV), 2022.](https://mlanthology.org/eccv/2022/liu2022eccv-breadcrumbs/) doi:10.1007/978-3-031-20053-3_37

BibTeX
@inproceedings{liu2022eccv-breadcrumbs,
title = {{Breadcrumbs: Adversarial Class-Balanced Sampling for Long-Tailed Recognition}},
author = {Liu, Bo and Li, Haoxiang and Kang, Hao and Hua, Gang and Vasconcelos, Nuno},
booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
year = {2022},
doi = {10.1007/978-3-031-20053-3_37},
url = {https://mlanthology.org/eccv/2022/liu2022eccv-breadcrumbs/}
}