Max-Margin Inspired Per-Sample Re-Weighting for Robust Deep Learning
Abstract
We design simple, explicit, and flexible per-sample re-weighting schemes for learning deep neural networks in a variety of tasks that require some form of robustness. These tasks include classification with label imbalance, domain adaptation, and tabular representation learning. Our re-weighting schemes are simple and can be used in combination with any popular optimization algorithm, such as SGD or Adam. Our techniques are inspired by max-margin learning and rely on mirror maps, such as the log-barrier and negative entropy, which have been shown to perform max-margin classification. Empirically, we demonstrate the superiority of our approach on all of the aforementioned tasks, achieving state-of-the-art results in tabular representation learning and domain adaptation.
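The negative-entropy mirror map mentioned in the abstract corresponds to multiplicative (exponentiated) weight updates on the probability simplex, which up-weight high-loss, small-margin samples. The paper's exact scheme is not specified in this abstract, so the following is only an illustrative sketch of that general idea; the function name and step size are hypothetical, not from the paper.

```python
import numpy as np

def entropy_mirror_reweight(losses, eta=1.0):
    """One mirror-descent-style step on the probability simplex with the
    negative-entropy mirror map: w_i proportional to exp(eta * loss_i).

    Illustrative sketch only (not the paper's exact algorithm): samples
    with larger loss (smaller margin) receive larger weight.
    """
    losses = np.asarray(losses, dtype=float)
    z = eta * (losses - losses.max())  # shift for numerical stability
    w = np.exp(z)
    return w / w.sum()

# Example: the third sample has the largest loss, so it gets the
# largest weight; the weights sum to 1.
losses = [0.1, 0.5, 2.0]
weights = entropy_mirror_reweight(losses, eta=1.0)
```

The per-sample weights would then scale each sample's loss term inside the usual SGD or Adam update, which is why the scheme composes with any standard optimizer.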
Cite

Kumar et al. "Max-Margin Inspired Per-Sample Re-Weighting for Robust Deep Learning." ICLR 2023 Workshops: Trustworthy_ML, 2023.
https://mlanthology.org/iclrw/2023/kumar2023iclrw-maxmargin/

BibTeX
@inproceedings{kumar2023iclrw-maxmargin,
title = {{Max-Margin Inspired Per-Sample Re-Weighting for Robust Deep Learning}},
author = {Kumar, Ramnath and Majmundar, Kushal Alpesh and Nagaraj, Dheeraj Mysore and Suggala, Arun},
booktitle = {ICLR 2023 Workshops: Trustworthy_ML},
year = {2023},
url = {https://mlanthology.org/iclrw/2023/kumar2023iclrw-maxmargin/}
}