Twin Contrastive Learning with Noisy Labels
Abstract
Learning from noisy data is a challenging task that significantly degrades model performance. In this paper, we present TCL, a novel twin contrastive learning model that learns robust representations and handles noisy labels for classification. Specifically, we construct a Gaussian mixture model (GMM) over the representations, injecting the supervised model predictions into the GMM to link its label-free latent variables with the label-noisy annotations. TCL then detects examples with wrong labels as out-of-distribution examples using another two-component GMM, taking the data distribution into account. We further propose cross-supervision with an entropy regularization loss that bootstraps the true targets from model predictions to handle the noisy labels. As a result, TCL can learn discriminative representations aligned with the estimated labels through mixup and contrastive learning. Extensive experimental results on several standard benchmarks and real-world datasets demonstrate the superior performance of TCL. In particular, TCL achieves a 7.5% improvement on CIFAR-10 with 90% label noise, an extremely noisy scenario. The source code is available at https://github.com/Hzzone/TCL.
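To make the noisy-label detection step concrete, below is a minimal sketch in Python of the two-component GMM idea. Note that TCL fits its mixture over learned representations together with model predictions; this sketch instead fits the mixture to per-example losses, a common proxy in the noisy-label literature, and all names (detect_noisy_labels, per_example_loss) are illustrative assumptions rather than the authors' code.

# Sketch: flag likely-mislabeled examples with a two-component GMM.
# The low-loss component is treated as "clean"; examples with a low
# posterior of belonging to it are flagged as out-of-distribution
# (i.e., likely carrying a wrong label).
import numpy as np
from sklearn.mixture import GaussianMixture

def detect_noisy_labels(per_example_loss, threshold=0.5):
    losses = np.asarray(per_example_loss, dtype=np.float64).reshape(-1, 1)
    gmm = GaussianMixture(n_components=2, random_state=0).fit(losses)
    # The component with the smaller mean loss models the clean examples.
    clean_component = int(np.argmin(gmm.means_.ravel()))
    p_clean = gmm.predict_proba(losses)[:, clean_component]
    return p_clean < threshold, p_clean

# Usage: two well-separated loss populations (900 clean, 100 noisy).
rng = np.random.default_rng(0)
losses = np.concatenate([rng.normal(0.1, 0.05, 900),
                         rng.normal(2.0, 0.30, 100)])
noisy_mask, p_clean = detect_noisy_labels(losses)
print(f"flagged {noisy_mask.sum()} of {losses.size} examples as noisy")

In TCL, the posterior from this detection step is what feeds the cross-supervision: flagged examples have their targets bootstrapped from model predictions rather than their given labels.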
Cite
Text
Huang et al. "Twin Contrastive Learning with Noisy Labels." Conference on Computer Vision and Pattern Recognition, 2023. doi:10.1109/CVPR52729.2023.01122
Markdown
[Huang et al. "Twin Contrastive Learning with Noisy Labels." Conference on Computer Vision and Pattern Recognition, 2023.](https://mlanthology.org/cvpr/2023/huang2023cvpr-twin/) doi:10.1109/CVPR52729.2023.01122
BibTeX
@inproceedings{huang2023cvpr-twin,
title = {{Twin Contrastive Learning with Noisy Labels}},
author = {Huang, Zhizhong and Zhang, Junping and Shan, Hongming},
booktitle = {Conference on Computer Vision and Pattern Recognition},
year = {2023},
pages = {11661--11670},
doi = {10.1109/CVPR52729.2023.01122},
url = {https://mlanthology.org/cvpr/2023/huang2023cvpr-twin/}
}