Label Noise: Ignorance Is Bliss
Abstract
We establish a new theoretical framework for learning under multi-class, instance-dependent label noise. This framework casts learning with label noise as a form of domain adaptation, in particular, domain adaptation under posterior drift. We introduce the concept of \emph{relative signal strength} (RSS), a pointwise measure that quantifies the transferability from the noisy posterior to the clean posterior. Using RSS, we establish nearly matching upper and lower bounds on the excess risk. Our theoretical findings support the simple \emph{Noise Ignorant Empirical Risk Minimization (NI-ERM)} principle, which minimizes empirical risk while ignoring label noise. Finally, we translate this theoretical insight into practice: by using NI-ERM to fit a linear classifier on top of a self-supervised feature extractor, we achieve state-of-the-art performance on the CIFAR-N data challenge.
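
To make the NI-ERM recipe concrete, below is a minimal Python sketch, not the authors' implementation. It fits an ordinary (noise-ignorant) linear classifier on frozen features and evaluates against clean labels that the learner never sees. The Gaussian feature model, the instance-dependent flip probabilities, and all names here (features, noisy_labels, etc.) are synthetic stand-ins; in practice the feature matrix would come from a pretrained self-supervised extractor.

# Hypothetical sketch of NI-ERM: minimize empirical risk on noisy labels,
# with no noise-rate estimation or loss correction of any kind.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, d, k = 5000, 64, 10

# Stand-in "self-supervised" features: one Gaussian cluster per class.
clean_labels = rng.integers(0, k, size=n)
centers = rng.normal(size=(k, d))
features = centers[clean_labels] + 0.5 * rng.normal(size=(n, d))

# Assumed instance-dependent noise model (for illustration only): points
# farther from their class center are flipped to a uniform label more often.
dist = np.linalg.norm(features - centers[clean_labels], axis=1)
flip = rng.random(n) < np.clip(0.1 * dist, 0.0, 0.4)
noisy_labels = np.where(flip, rng.integers(0, k, size=n), clean_labels)

# NI-ERM: fit a plain linear classifier to the noisy labels, ignoring noise.
clf = LogisticRegression(max_iter=1000).fit(features, noisy_labels)

# Evaluate against the clean labels the learner never saw.
print(f"flip rate: {flip.mean():.2f}")
print(f"clean accuracy: {clf.score(features, clean_labels):.2f}")

The point of the sketch is that the learner touches only noisy_labels, which is exactly what "ignoring" the noise means here; when the features are informative, the linear classifier still recovers the clean classes well.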
Cite
Text
Zhu et al. "Label Noise: Ignorance Is Bliss." Neural Information Processing Systems, 2024. doi:10.52202/079017-3701

Markdown
[Zhu et al. "Label Noise: Ignorance Is Bliss." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/zhu2024neurips-label/) doi:10.52202/079017-3701

BibTeX
@inproceedings{zhu2024neurips-label,
  title     = {{Label Noise: Ignorance Is Bliss}},
  author    = {Zhu, Yilun and Zhang, Jianxin and Gangrade, Aditya and Scott, Clayton},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-3701},
  url       = {https://mlanthology.org/neurips/2024/zhu2024neurips-label/}
}