Learning with Noisy Class Labels for Instance Segmentation

Abstract

Instance segmentation has achieved significant progress in the presence of correctly annotated datasets. Yet, object classes in large-scale datasets are sometimes ambiguous, which easily causes confusion. In addition, limited experience and knowledge of annotators can also lead to mislabeled object classes. To solve this issue, a novel method is proposed in this paper, which uses different losses describing the different roles of noisy class labels to enhance learning. Specifically, in instance segmentation, noisy class labels play different roles in the foreground-background sub-task and the foreground-instance sub-task. Hence, on the one hand, a noise-robust loss (e.g., symmetric loss) is used to prevent incorrect gradient guidance for the foreground-instance sub-task. On the other hand, the standard cross entropy loss is used to fully exploit correct gradient guidance for the foreground-background sub-task. Extensive experiments conducted on three popular datasets (i.e., Pascal VOC, Cityscapes and COCO) have demonstrated the effectiveness of our method in a wide range of noisy class label scenarios.
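
The sketch below illustrates the loss split the abstract describes, assuming a Mask R-CNN-style head with a class-logits branch and a foreground/background (objectness) branch. The symmetric cross entropy form and the `alpha`/`beta`/clamp values follow the cited example of a noise-robust loss (symmetric loss) and are illustrative defaults, not the authors' exact implementation; all function and variable names here are hypothetical.

```python
import torch
import torch.nn.functional as F


def symmetric_cross_entropy(cls_logits, cls_targets, num_classes,
                            alpha=0.1, beta=1.0):
    """Noise-robust loss for the foreground-instance (classification) sub-task."""
    # Standard CE term, down-weighted by alpha to limit fitting noisy labels.
    ce = F.cross_entropy(cls_logits, cls_targets)
    # Reverse CE term: log(0) in the one-hot labels is truncated via clamping.
    pred = F.softmax(cls_logits, dim=1).clamp(min=1e-7, max=1.0)
    one_hot = F.one_hot(cls_targets, num_classes).float().clamp(min=1e-4, max=1.0)
    rce = (-pred * one_hot.log()).sum(dim=1).mean()
    return alpha * ce + beta * rce


def instance_segmentation_loss(cls_logits, cls_targets, num_classes,
                               obj_logits, obj_targets):
    """Use each loss for the sub-task where it helps:
    - foreground-instance sub-task: noise-robust symmetric loss, so mislabeled
      classes do not provide incorrect gradient guidance;
    - foreground-background sub-task: standard (binary) cross entropy, since
      foreground/background supervision is unaffected by class-label noise.
    """
    cls_loss = symmetric_cross_entropy(cls_logits, cls_targets, num_classes)
    fg_bg_loss = F.binary_cross_entropy_with_logits(obj_logits, obj_targets)
    return cls_loss + fg_bg_loss
```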

Cite

Text

Yang et al. "Learning with Noisy Class Labels for Instance Segmentation." Proceedings of the European Conference on Computer Vision (ECCV), 2020. doi:10.1007/978-3-030-58568-6_3

Markdown

[Yang et al. "Learning with Noisy Class Labels for Instance Segmentation." Proceedings of the European Conference on Computer Vision (ECCV), 2020.](https://mlanthology.org/eccv/2020/yang2020eccv-learning/) doi:10.1007/978-3-030-58568-6_3

BibTeX

@inproceedings{yang2020eccv-learning,
  title     = {{Learning with Noisy Class Labels for Instance Segmentation}},
  author    = {Yang, Longrong and Meng, Fanman and Li, Hongliang and Wu, Qingbo and Cheng, Qishang},
  booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
  year      = {2020},
  doi       = {10.1007/978-3-030-58568-6_3},
  url       = {https://mlanthology.org/eccv/2020/yang2020eccv-learning/}
}