Multiclass Learning with Margin: Exponential Rates with No Bias-Variance Trade-Off

Abstract

We study the behavior of error bounds for multiclass classification under suitable margin conditions. For a wide variety of methods we prove that, under a hard-margin condition, the classification error decreases exponentially fast without any bias-variance trade-off. Different convergence rates are obtained under different margin assumptions. Through a self-contained and instructive analysis, we generalize known results from the binary to the multiclass setting.
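The phenomenon described in the abstract can be illustrated empirically. The sketch below is not the paper's method; it is a hypothetical toy experiment with a histogram plug-in classifier on a three-class problem where the most likely label beats the others by a fixed gap everywhere (a hard-margin condition). Under that condition, each bin's majority vote is correct except on an event whose probability decays exponentially in the bin's sample count, so the excess error over the Bayes classifier drops rapidly as the sample size grows, with no bias-variance trade-off from tuning the bin width. All names and parameter values here (`NOISE`, `bins`, the sample sizes) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 3        # number of classes
NOISE = 0.1  # label noise; the top class probability exceeds the rest by a fixed gap

def bayes_class(x):
    # Bayes-optimal label: deterministic partition of [0, 1) into K regions
    return np.minimum((x * K).astype(int), K - 1)

def sample(n):
    # labels agree with the Bayes class except under uniform noise,
    # so max_c P(y = c | x) is bounded away from the other class probabilities
    x = rng.random(n)
    y = bayes_class(x)
    flip = rng.random(n) < NOISE
    y[flip] = rng.integers(0, K, flip.sum())
    return x, y

def fit_plugin(x, y, bins=12):
    # histogram plug-in classifier: majority vote within each bin
    idx = np.minimum((x * bins).astype(int), bins - 1)
    votes = np.zeros((bins, K), dtype=int)
    np.add.at(votes, (idx, y), 1)
    return votes.argmax(axis=1), bins

def error_vs_bayes(pred_per_bin, bins, m=20000):
    # disagreement with the Bayes classifier on fresh test points
    xt = rng.random(m)
    idx = np.minimum((xt * bins).astype(int), bins - 1)
    return float(np.mean(pred_per_bin[idx] != bayes_class(xt)))

errors = []
for n in [60, 240, 960, 3840]:
    x, y = sample(n)
    pred_per_bin, bins = fit_plugin(x, y)
    errors.append(error_vs_bayes(pred_per_bin, bins))
print(errors)  # excess error shrinks quickly as n grows
```

With a fixed bin width, each quadrupling of the sample size multiplies the per-bin sample count, and the probability that any bin's majority vote disagrees with the Bayes class shrinks exponentially, which is the qualitative behavior the paper establishes rigorously for a broad family of methods.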

Cite

Text

Vigogna et al. "Multiclass Learning with Margin: Exponential Rates with No Bias-Variance Trade-Off." International Conference on Machine Learning, 2022.

Markdown

[Vigogna et al. "Multiclass Learning with Margin: Exponential Rates with No Bias-Variance Trade-Off." International Conference on Machine Learning, 2022.](https://mlanthology.org/icml/2022/vigogna2022icml-multiclass/)

BibTeX

@inproceedings{vigogna2022icml-multiclass,
  title     = {{Multiclass Learning with Margin: Exponential Rates with No Bias-Variance Trade-Off}},
  author    = {Vigogna, Stefano and Meanti, Giacomo and De Vito, Ernesto and Rosasco, Lorenzo},
  booktitle = {International Conference on Machine Learning},
  year      = {2022},
  pages     = {22260--22269},
  volume    = {162},
  url       = {https://mlanthology.org/icml/2022/vigogna2022icml-multiclass/}
}