Boosting with Tempered Exponential Measures

Abstract

One of the most popular ML algorithms, AdaBoost, can be derived from the dual of a relative entropy minimization problem subject to the constraint that the positive weights on the examples sum to one. Essentially, harder examples receive higher probabilities. We generalize this setup to the recently introduced *tempered exponential measures* (TEMs), where normalization is enforced on a specific power of the measure and not the measure itself. TEMs are indexed by a parameter $t$ and generalize exponential families ($t=1$). Our algorithm, $t$-AdaBoost, recovers AdaBoost as a special case ($t=1$). We show that $t$-AdaBoost retains AdaBoost's celebrated exponential convergence rate when $t\in [0,1)$ while allowing a slight improvement of the rate's hidden constant compared to $t=1$. $t$-AdaBoost partially computes on a generalization of classical arithmetic over the reals and brings notable properties like guaranteed bounded leveraging coefficients for $t\in [0,1)$. From the loss that $t$-AdaBoost minimizes (a generalization of the exponential loss), we show how to derive a new family of *tempered* losses for the induction of domain-partitioning classifiers like decision trees. Crucially, strict properness is ensured for all of these losses, while their boosting rates span the full known spectrum. Experiments using $t$-AdaBoost+trees demonstrate that significant leverage can be achieved by tuning $t$.
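To make the tempered setup concrete, below is a minimal illustrative sketch (not the paper's exact $t$-AdaBoost) of a boosting loop whose example-weight update uses the tempered logarithm and exponential, $\log_t$ and $\exp_t$, and normalizes a power of the measure rather than the measure itself. The normalization power `p`, the decision-stump weak learner, and the classical AdaBoost leveraging coefficient `alpha` used here are assumptions for illustration only; at `t = 1` and `p = 1` the loop reduces to plain AdaBoost.

```python
# Illustrative sketch only: tempered-weight boosting with decision stumps.
# `p`, the stump learner, and the classical `alpha` are hypothetical choices,
# not the paper's tempered leveraging scheme.
import numpy as np

def log_t(x, t):
    """Tempered logarithm: (x^(1-t) - 1) / (1 - t); reduces to log(x) at t = 1."""
    return np.log(x) if t == 1.0 else (x ** (1.0 - t) - 1.0) / (1.0 - t)

def exp_t(x, t):
    """Tempered exponential: [1 + (1-t)x]_+^(1/(1-t)); reduces to exp(x) at t = 1."""
    return np.exp(x) if t == 1.0 else np.maximum(0.0, 1.0 + (1.0 - t) * x) ** (1.0 / (1.0 - t))

def best_stump(X, y, w):
    """Weighted decision stump: pick (feature, threshold, sign) minimizing weighted error."""
    best = (np.inf, None)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for s in (+1.0, -1.0):
                pred = s * np.where(X[:, j] <= thr, 1.0, -1.0)
                err = np.sum(w * (pred != y))
                if err < best[0]:
                    best = (err, (j, thr, s))
    return best[1]

def stump_predict(stump, X):
    j, thr, s = stump
    return s * np.where(X[:, j] <= thr, 1.0, -1.0)

def tempered_boost(X, y, t=0.8, p=2.0, rounds=20):
    """Boosting sketch with a tempered weight update; plain AdaBoost at t = 1, p = 1."""
    n = len(y)
    w = np.full(n, (1.0 / n) ** (1.0 / p))        # uniform start, so sum(w**p) == 1
    ensemble = []
    for _ in range(rounds):
        stump = best_stump(X, y, w ** p)          # weak learner sees the normalized power
        h = stump_predict(stump, X)
        eps = np.clip(np.sum((w ** p) * (h != y)), 1e-12, 1 - 1e-12)
        alpha = 0.5 * np.log((1.0 - eps) / eps)   # placeholder: classical AdaBoost leverage
        w = exp_t(log_t(w, t) - alpha * y * h, t) # tempered multiplicative-style update
        w /= np.sum(w ** p) ** (1.0 / p)          # normalize the p-th power of the measure
        ensemble.append((alpha, stump))
    return ensemble

def ensemble_predict(ensemble, X):
    return np.sign(sum(a * stump_predict(s, X) for a, s in ensemble))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
    H = tempered_boost(X, y, t=0.8)
    print("train accuracy:", np.mean(ensemble_predict(H, X) == y))
```

The sketch only illustrates the shape of the update (weights moved in $\log_t$ space, then renormalized on a power of the measure); the paper's analysis, leveraging coefficients, and guarantees apply to its own $t$-AdaBoost, not to this simplified loop.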

Cite

Text

Nock et al. "Boosting with Tempered Exponential Measures." Neural Information Processing Systems, 2023.

Markdown

[Nock et al. "Boosting with Tempered Exponential Measures." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/nock2023neurips-boosting/)

BibTeX

@inproceedings{nock2023neurips-boosting,
  title     = {{Boosting with Tempered Exponential Measures}},
  author    = {Nock, Richard and Amid, Ehsan and Warmuth, Manfred},
  booktitle = {Neural Information Processing Systems},
  year      = {2023},
  url       = {https://mlanthology.org/neurips/2023/nock2023neurips-boosting/}
}