Improved Margin Generalization Bounds for Voting Classifiers

Abstract

In this paper we establish a new margin-based generalization bound for voting classifiers, refining existing results and yielding tighter generalization guarantees for widely used boosting algorithms such as AdaBoost (Freund and Schapire, 1997). Furthermore, the new margin-based generalization bound enables the derivation of an optimal weak-to-strong learner: a majority vote over three large-margin classifiers with an expected error matching the theoretical lower bound. This result provides a more natural alternative to the Majority-of-5 algorithm of Høgsgaard et al. (2024), and matches the Majority-of-3 result of Aden-Ali et al. (2024) for the realizable prediction model.
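The abstract's weak-to-strong learner outputs the majority vote of three classifiers. A minimal sketch of such a Majority-of-3 combiner over ±1-valued predictors (the toy base classifiers below are hypothetical stand-ins, not the paper's large-margin voting classifiers):

```python
def majority_of_3(h1, h2, h3):
    """Combine three +/-1-valued classifiers into their majority vote."""
    def vote(x):
        # With three +/-1 votes the sum is never zero, so the sign
        # of the sum is exactly the majority vote.
        s = h1(x) + h2(x) + h3(x)
        return 1 if s > 0 else -1
    return vote

# Hypothetical toy base classifiers (thresholds on a real-valued input).
h1 = lambda x: 1 if x > 0 else -1
h2 = lambda x: 1 if x > -1 else -1
h3 = lambda x: 1 if x > 1 else -1

g = majority_of_3(h1, h2, h3)
```

For an input where two of the three base classifiers agree, `g` returns their shared label; for example, `g(0.5)` returns `1` because `h1` and `h2` vote `+1` while `h3` votes `-1`.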

Cite

Text

Høgsgaard Møller and Green Larsen. "Improved Margin Generalization Bounds for Voting Classifiers." Proceedings of Thirty Eighth Conference on Learning Theory, 2025.

Markdown

[Høgsgaard Møller and Green Larsen. "Improved Margin Generalization Bounds for Voting Classifiers." Proceedings of Thirty Eighth Conference on Learning Theory, 2025.](https://mlanthology.org/colt/2025/hgsgaardmller2025colt-improved/)

BibTeX

@inproceedings{hgsgaardmller2025colt-improved,
  title     = {{Improved Margin Generalization Bounds for Voting Classifiers}},
  author    = {Høgsgaard Møller, Mikael and Green Larsen, Kasper},
  booktitle = {Proceedings of Thirty Eighth Conference on Learning Theory},
  year      = {2025},
  pages     = {2822--2855},
  volume    = {291},
  url       = {https://mlanthology.org/colt/2025/hgsgaardmller2025colt-improved/}
}