REDUCR: Robust Data Downsampling Using Class Priority Reweighting

Abstract

Modern machine learning models are becoming increasingly expensive to train for real-world image and text classification tasks, where massive web-scale data is collected in a streaming fashion. To reduce the training cost, online batch selection techniques have been developed to choose the most informative datapoints. However, many existing techniques are not robust to class imbalance and distributional shifts, and can suffer from poor worst-class generalization performance. This work introduces REDUCR, a robust and efficient data downsampling method that uses class priority reweighting. REDUCR reduces the training data while preserving worst-class generalization performance, assigning priority weights to datapoints in a class-aware manner via an online learning algorithm. We demonstrate the data efficiency and robust performance of REDUCR on vision and text classification tasks. On web-scraped datasets with imbalanced class distributions, REDUCR significantly improves worst-class test accuracy (and average accuracy), surpassing state-of-the-art methods by around 15%.
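
The class-aware prioritization described in the abstract can be illustrated with a multiplicative-weights (Hedge-style) sketch: maintain one priority weight per class, score each incoming point by its loss scaled by its class weight, keep the top-k points, and boost the weights of classes that are currently losing the most. This is a minimal illustration under stated assumptions, not the authors' exact algorithm; the function names, the learning rate `eta`, and the top-k selection rule are all assumptions made here.

```python
import numpy as np

def hedge_update(class_weights, class_losses, eta=0.1):
    """Multiplicative-weights update: classes with higher observed loss
    gain priority, so the worst class is not neglected."""
    w = class_weights * np.exp(eta * class_losses)
    return w / w.sum()  # renormalize to a distribution over classes

def select_batch(losses, labels, class_weights, k):
    """Class-reweighted online batch selection: keep the k points whose
    loss, scaled by their class's priority weight, is largest."""
    scores = class_weights[labels] * losses
    return np.argsort(scores)[-k:]
```

In this sketch, a class with persistently high loss sees its weight grow, so its datapoints are more likely to survive downsampling even when the class is rare in the stream.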

Cite

Text

Bankes et al. "REDUCR: Robust Data Downsampling Using Class Priority Reweighting." Neural Information Processing Systems, 2024. doi:10.52202/079017-2632

Markdown

[Bankes et al. "REDUCR: Robust Data Downsampling Using Class Priority Reweighting." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/bankes2024neurips-reducr/) doi:10.52202/079017-2632

BibTeX

@inproceedings{bankes2024neurips-reducr,
  title     = {{REDUCR: Robust Data Downsampling Using Class Priority Reweighting}},
  author    = {Bankes, William and Hughes, George and Bogunovic, Ilija and Wang, Zi},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-2632},
  url       = {https://mlanthology.org/neurips/2024/bankes2024neurips-reducr/}
}