PLReMix: Combating Noisy Labels with Pseudo-Label Relaxed Contrastive Representation Learning

Abstract

Recently, the use of Contrastive Representation Learning (CRL) as a pre-training technique has improved the performance of learning with noisy labels (LNL) methods. However, when the CRL loss is trivially combined with LNL methods in an end-to-end framework rather than used for pre-training, empirical experiments show severe performance degradation. We verify through experiments that this issue is caused by optimization conflicts between the losses and propose an end-to-end PLReMix framework by introducing a Pseudo-Label Relaxed (PLR) contrastive loss. This PLR loss constructs a reliable negative set for each sample by filtering out its inappropriate negative pairs, alleviating the conflicts that arise from trivially combining these losses. The proposed PLR loss is pluggable, and we have integrated it into other LNL methods, observing improved performance. Furthermore, a two-dimensional Gaussian Mixture Model is adopted to distinguish clean and noisy samples by leveraging semantic information and model outputs simultaneously. Experiments on multiple benchmark datasets demonstrate the effectiveness of the proposed method. Code will be available.
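To make the core idea concrete, below is a minimal PyTorch sketch of a pseudo-label relaxed contrastive loss in the spirit the abstract describes: negatives whose pseudo-labels agree with the anchor's are dropped from the contrastive denominator, so the contrastive objective no longer pushes apart samples the classifier believes share a class. The function name `plr_contrastive_loss`, the top-k agreement rule, and all tensor shapes are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def plr_contrastive_loss(embeddings, probs, temperature=0.5, top_k=2):
    """Hypothetical sketch of a pseudo-label relaxed (PLR) contrastive loss.

    embeddings: (2N, d) projections of two augmented views, ordered so that
                sample i and sample i+N are views of the same image.
    probs:      (2N, C) softmax outputs used as soft pseudo-labels.

    Negatives whose top-k predicted classes overlap with the anchor's are
    removed ("relaxed") from the denominator, reducing the conflict between
    the contrastive loss and the classification loss. (Assumed rule.)
    """
    z = F.normalize(embeddings, dim=1)
    n2 = z.size(0)
    n = n2 // 2
    sim = z @ z.t() / temperature                        # pairwise similarities

    # Two samples "agree" if their top-k pseudo-label sets intersect.
    topk = probs.topk(top_k, dim=1).indices              # (2N, k)
    agree = (topk.unsqueeze(1).unsqueeze(-1) ==
             topk.unsqueeze(0).unsqueeze(-2)).any(-1).any(-1)  # (2N, 2N)

    eye = torch.eye(n2, dtype=torch.bool, device=z.device)
    pos_idx = torch.arange(n2, device=z.device).roll(n)  # index of the other view
    pos = sim[torch.arange(n2, device=z.device), pos_idx]

    # Keep a pair in the denominator only if it is the positive or a
    # pseudo-label-disjoint negative; mask out self and agreeing negatives.
    keep = (~agree) & (~eye)
    keep[torch.arange(n2, device=z.device), pos_idx] = True
    logits = sim.masked_fill(~keep, float('-inf'))

    # InfoNCE-style loss over the relaxed negative set.
    loss = -(pos - torch.logsumexp(logits, dim=1))
    return loss.mean()
```

With `top_k = 1` this reduces to excluding negatives that share the anchor's hard pseudo-label; larger `top_k` relaxes more aggressively, which trades noise robustness against the number of usable negatives.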

Cite

Text

Liu et al. "PLReMix: Combating Noisy Labels with Pseudo-Label Relaxed Contrastive Representation Learning." Winter Conference on Applications of Computer Vision, 2025.

Markdown

[Liu et al. "PLReMix: Combating Noisy Labels with Pseudo-Label Relaxed Contrastive Representation Learning." Winter Conference on Applications of Computer Vision, 2025.](https://mlanthology.org/wacv/2025/liu2025wacv-plremix/)

BibTeX

@inproceedings{liu2025wacv-plremix,
  title     = {{PLReMix: Combating Noisy Labels with Pseudo-Label Relaxed Contrastive Representation Learning}},
  author    = {Liu, Xiaoyu and Zhou, Beitong and Yue, Zuogong and Cheng, Cheng},
  booktitle = {Winter Conference on Applications of Computer Vision},
  year      = {2025},
  pages     = {6517--6527},
  url       = {https://mlanthology.org/wacv/2025/liu2025wacv-plremix/}
}