Boosting Co-Teaching with Compression Regularization for Label Noise
Abstract
In this paper, we study the problem of learning image classification models in the presence of label noise. We revisit a simple compression regularization named Nested Dropout [22]. We find that Nested Dropout, though originally proposed for fast information retrieval and adaptive data compression, can properly regularize a neural network to combat label noise. Moreover, owing to its simplicity, it can easily be combined with Co-teaching [5] to further boost performance. Our final model remains simple yet effective: it achieves comparable or even better performance than state-of-the-art approaches on two real-world datasets with label noise, Clothing1M [28] and ANIMAL-10N [24]. On Clothing1M, our approach obtains 74.9% accuracy, which is slightly better than that of DivideMix [12]. On ANIMAL-10N, we achieve 84.1% accuracy, while the best published result, by PLC [30], is 83.4%. We hope that our simple approach can serve as a strong baseline for learning with label noise. Our implementation is available at https://github.com/yingyichen-cyy/Nested-Co-teaching.
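To make the regularizer concrete, below is a minimal PyTorch sketch of nested dropout applied to a feature layer: a truncation index is sampled from a geometric distribution and all channels beyond it are zeroed, which pushes the earliest channels to carry the most important information. The module name `NestedDropout` and the geometric parameter `p` are illustrative assumptions for this sketch, not the exact configuration used in the paper's released code.

```python
import torch
import torch.nn as nn


class NestedDropout(nn.Module):
    """Minimal sketch of nested dropout as a compression regularizer.

    During training, a truncation index k is drawn from a geometric
    distribution and all feature channels with index >= k are zeroed,
    so low-index channels are forced to encode the most information.
    The parameter `p` below is an illustrative choice, not the value
    from the paper.
    """

    def __init__(self, p: float = 0.01):
        super().__init__()
        self.p = p

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if not self.training:
            return x  # identity at evaluation time
        n_units = x.shape[1]
        # Sample truncation index k in [1, n_units].
        k = int(torch.distributions.Geometric(self.p).sample().item()) + 1
        k = min(k, n_units)
        # Keep channels [0, k), zero out the rest.
        mask = torch.zeros(n_units, device=x.device, dtype=x.dtype)
        mask[:k] = 1.0
        # Broadcast the mask over the channel dimension.
        shape = [1, n_units] + [1] * (x.dim() - 2)
        return x * mask.view(shape)
```

In a typical use, such a module would sit between the backbone's penultimate feature layer and the classifier, e.g. `logits = classifier(NestedDropout()(features))`; this placement is our assumption for illustration rather than a statement about the released implementation.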
Cite
Text
Chen et al. "Boosting Co-Teaching with Compression Regularization for Label Noise." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2021. doi:10.1109/CVPRW53098.2021.00302

Markdown
[Chen et al. "Boosting Co-Teaching with Compression Regularization for Label Noise." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2021.](https://mlanthology.org/cvprw/2021/chen2021cvprw-boosting/) doi:10.1109/CVPRW53098.2021.00302

BibTeX
@inproceedings{chen2021cvprw-boosting,
title = {{Boosting Co-Teaching with Compression Regularization for Label Noise}},
author = {Chen, Yingyi and Shen, Xi and Hu, Shell Xu and Suykens, Johan A. K.},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2021},
  pages = {2688--2692},
doi = {10.1109/CVPRW53098.2021.00302},
url = {https://mlanthology.org/cvprw/2021/chen2021cvprw-boosting/}
}