Relaxed Clipping: A Global Training Method for Robust Regression and Classification

Abstract

Robust regression and classification are often thought to require non-convex loss functions that prevent scalable, global training. However, such a view neglects the possibility of reformulated training methods that can yield practically solvable alternatives. A natural way to make a loss function more robust to outliers is to truncate loss values that exceed a maximum threshold. We demonstrate that a relaxation of this form of "loss clipping" can be made globally solvable and applicable to any standard loss while guaranteeing robustness against outliers. We present a generic procedure that can be applied to standard loss functions and demonstrate improved robustness in regression and classification problems.
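To make the clipping idea concrete, here is a minimal sketch in Python. The names `clipped_loss` and `threshold` are illustrative, not from the paper: truncating a base loss at a fixed threshold caps the contribution any single outlier can make to the training objective. Note that this shows only the non-convex clipped loss itself, not the paper's relaxed, globally solvable reformulation.

```python
import numpy as np

def clipped_loss(base_loss, threshold):
    """Truncate a base loss at `threshold`, so each example
    contributes at most `threshold` to the objective."""
    def loss(y_true, y_pred):
        return np.minimum(base_loss(y_true, y_pred), threshold)
    return loss

# Example: clipping the squared loss for robust regression.
squared = lambda y, f: (y - f) ** 2
robust = clipped_loss(squared, threshold=1.0)

y = np.array([0.1, 0.2, 10.0])   # last point is an outlier
f = np.array([0.0, 0.3, 0.0])
print(robust(y, f))              # [0.01 0.01 1.0]; outlier capped at 1.0
```

Minimizing such a clipped objective directly is non-convex; the paper's contribution is a relaxation of exactly this truncation step that admits global training.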

Cite

Text

Yang et al. "Relaxed Clipping: A Global Training Method for Robust Regression and Classification." Neural Information Processing Systems, 2010.

Markdown

[Yang et al. "Relaxed Clipping: A Global Training Method for Robust Regression and Classification." Neural Information Processing Systems, 2010.](https://mlanthology.org/neurips/2010/yang2010neurips-relaxed/)

BibTeX

@inproceedings{yang2010neurips-relaxed,
  title     = {{Relaxed Clipping: A Global Training Method for Robust Regression and Classification}},
  author    = {Yang, Min and Xu, Linli and White, Martha and Schuurmans, Dale and Yu, Yao-liang},
  booktitle = {Neural Information Processing Systems},
  year      = {2010},
  pages     = {2532--2540},
  url       = {https://mlanthology.org/neurips/2010/yang2010neurips-relaxed/}
}