Distribution-Balanced Loss for Multi-Label Classification in Long-Tailed Datasets
Abstract
We present a new loss function called Distribution-Balanced Loss for multi-label recognition problems that exhibit long-tailed class distributions. Compared to conventional single-label classification problems, multi-label recognition problems are often more challenging due to two significant issues, namely the co-occurrence of labels and the dominance of negative labels (when treated as multiple binary classification problems). The Distribution-Balanced Loss tackles these issues through two key modifications to the standard binary cross-entropy loss: 1) a new way to re-balance the weights that takes into account the impact caused by label co-occurrence, and 2) a negative-tolerant regularization to mitigate the over-suppression of negative labels. Experiments on both Pascal VOC and COCO show that models trained with this new loss function achieve significant performance gains over existing methods.
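The two modifications can be illustrated with a minimal PyTorch-style sketch, shown below under stated assumptions: the function name `distribution_balanced_loss` and the hyper-parameters `alpha`, `beta`, `mu`, and `lamb` are illustrative choices, not names from the paper, and the class-prior bias term used in the paper's full formulation of the negative-tolerant regularization is omitted for brevity. The sketch is meant to convey the idea, not to reproduce the authors' implementation.

```python
import torch
import torch.nn.functional as F

def distribution_balanced_loss(logits, targets, class_freq,
                               alpha=0.1, beta=10.0, mu=0.3, lamb=5.0):
    """Sketch of a distribution-balanced multi-label loss.

    logits:     (B, C) raw classifier scores
    targets:    (B, C) multi-hot labels in {0, 1}
    class_freq: (C,)   number of training instances per class
    """
    # --- re-balanced weighting ------------------------------------------
    # Class-level sampling probability ~ 1 / n_k versus the instance-level
    # probability ~ sum of 1 / n_k over the instance's positive labels;
    # their ratio corrects for label co-occurrence.
    inv_freq = 1.0 / class_freq                              # (C,)
    pc = inv_freq.unsqueeze(0)                               # (1, C)
    pi = (targets * inv_freq).sum(dim=1, keepdim=True)       # (B, 1)
    r = pc / pi.clamp(min=1e-12)                             # raw ratio
    # Smooth the ratio into a bounded range to avoid extreme weights.
    r_hat = alpha + torch.sigmoid(beta * (r - mu))           # (B, C)

    # --- negative-tolerant regularization -------------------------------
    # Positives keep the usual logistic loss; negatives are re-scaled by
    # 1/lambda to soften the over-suppression of tail-class negatives.
    pos_loss = F.softplus(-logits)                           # log(1 + e^{-z})
    neg_loss = F.softplus(lamb * logits) / lamb              # (1/λ) log(1 + e^{λz})
    loss = targets * pos_loss + (1.0 - targets) * neg_loss

    return (r_hat * loss).mean()
```

In this sketch, `r_hat` down-weights head classes that co-occur with many frequent labels and keeps weights bounded via the sigmoid smoothing, while the `1/lamb` scaling flattens the gradient contributed by the abundant negative labels.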
Cite
Text
Wu et al. "Distribution-Balanced Loss for Multi-Label Classification in Long-Tailed Datasets." Proceedings of the European Conference on Computer Vision (ECCV), 2020. doi:10.1007/978-3-030-58548-8_10
Markdown
[Wu et al. "Distribution-Balanced Loss for Multi-Label Classification in Long-Tailed Datasets." Proceedings of the European Conference on Computer Vision (ECCV), 2020.](https://mlanthology.org/eccv/2020/wu2020eccv-distributionbalanced/) doi:10.1007/978-3-030-58548-8_10
BibTeX
@inproceedings{wu2020eccv-distributionbalanced,
title = {{Distribution-Balanced Loss for Multi-Label Classification in Long-Tailed Datasets}},
author = {Wu, Tong and Huang, Qingqiu and Liu, Ziwei and Wang, Yu and Lin, Dahua},
booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
year = {2020},
doi = {10.1007/978-3-030-58548-8_10},
url = {https://mlanthology.org/eccv/2020/wu2020eccv-distributionbalanced/}
}