Relative Deviation Margin Bounds

Abstract

We present a series of new and more favorable margin-based learning guarantees that depend on the empirical margin loss of a predictor. We give two types of learning bounds, in terms of either the Rademacher complexity or the empirical $\ell_\infty$-covering number of the hypothesis set used, both distribution-dependent and valid for general families. Furthermore, using our relative deviation margin bounds, we derive distribution-dependent generalization bounds for unbounded loss functions under the assumption of a finite moment. We also briefly highlight several applications of these bounds and discuss their connection with existing results.
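For orientation, a minimal sketch of the quantities the abstract refers to, using the standard notation of the margin-bound literature (sample $S$, hypothesis set $\mathcal{H}$, margin $\rho$, generalization error $R(h)$); this is an assumed, generic form for illustration, not the paper's exact statement.

% Standard empirical rho-margin loss of h on S = ((x_1, y_1), ..., (x_m, y_m)):
\[
\widehat{R}_{S,\rho}(h) \;=\; \frac{1}{m} \sum_{i=1}^{m} \mathbf{1}_{\, y_i h(x_i) \le \rho}.
\]
% A relative deviation margin bound controls the deviation of R(h) from the
% empirical margin loss, rescaled by a quantity such as sqrt(R(h)), e.g.
\[
\sup_{h \in \mathcal{H}} \; \frac{R(h) - \widehat{R}_{S,\rho}(h)}{\sqrt{R(h)}}
\;\le\; \text{complexity term (Rademacher complexity or $\ell_\infty$-covering number of $\mathcal{H}$)}.
\]

The rescaling by $\sqrt{R(h)}$ is what makes such bounds "relative": they are tighter than additive bounds when the error is small, which is the regime the paper's margin-based guarantees target.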

Cite

Text

Cortes et al. "Relative Deviation Margin Bounds." International Conference on Machine Learning, 2021.

Markdown

[Cortes et al. "Relative Deviation Margin Bounds." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/cortes2021icml-relative/)

BibTeX

@inproceedings{cortes2021icml-relative,
  title     = {{Relative Deviation Margin Bounds}},
  author    = {Cortes, Corinna and Mohri, Mehryar and Suresh, Ananda Theertha},
  booktitle = {International Conference on Machine Learning},
  year      = {2021},
  pages     = {2122--2131},
  volume    = {139},
  url       = {https://mlanthology.org/icml/2021/cortes2021icml-relative/}
}