The Dynamics of AdaBoost: Cyclic Behavior and Convergence of Margins

Abstract

To study the convergence properties of the AdaBoost algorithm, we reduce AdaBoost to a nonlinear iterated map and study the evolution of its weight vectors. This dynamical systems approach allows us to understand AdaBoost's convergence properties completely in certain cases; for these cases we find stable cycles, allowing us to solve explicitly for AdaBoost's output. Using this technique, we show that AdaBoost does not always converge to a maximum margin combined classifier, answering an open question. We also show that "non-optimal" AdaBoost (where the weak learning algorithm does not necessarily choose the best weak classifier at each iteration) may fail to converge to a maximum margin classifier even when "optimal" AdaBoost produces one. Finally, we show that if AdaBoost cycles, it cycles among "support vectors", i.e., examples that achieve the same smallest margin.
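As a concrete illustration of the iterated-map view, here is a minimal sketch in Python (not the authors' code; the mistake matrix M, the tie-breaking rule, and the round count are illustrative assumptions). With the optimal step size, AdaBoost's reweighting of example i reduces to d_{t+1}(i) = d_t(i) / (1 + M[i, j_t] * r_t), where M[i, j] = y_i h_j(x_i) is +1 if weak classifier j correctly classifies example i and -1 otherwise, and r_t is the edge of the classifier chosen at round t. Iterating this map on a small mistake matrix of the kind analyzed in the paper makes the cyclic behavior visible.

import numpy as np

def adaboost_weight_map(M, n_rounds=30):
    """Iterate AdaBoost's weight-vector map on a mistake matrix M.

    M[i, j] = +1 if weak classifier j correctly classifies example i,
    -1 otherwise. Returns the sequence of weight vectors d_t.
    """
    n_examples, _ = M.shape
    d = np.full(n_examples, 1.0 / n_examples)  # uniform initial weights
    trajectory = [d.copy()]
    for _ in range(n_rounds):
        edges = M.T @ d              # edge of each weak classifier
        j = int(np.argmax(edges))    # "optimal" AdaBoost: pick the best one
        r = edges[j]                 # edge of the chosen classifier
        d = d / (1.0 + M[:, j] * r)  # iterated-map form of the reweighting
        trajectory.append(d.copy())  # normalization is preserved exactly
    return np.array(trajectory)

# Hypothetical example: a 3x3 mistake matrix where each weak classifier
# misclassifies exactly one example, the simplest setting in which the
# weight vectors settle into a cycle.
M = np.array([[-1,  1,  1],
              [ 1, -1,  1],
              [ 1,  1, -1]])
traj = adaboost_weight_map(M, n_rounds=12)
print(np.round(traj[-6:], 3))  # later iterates reveal the cyclic behavior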

Cite

Text

Cynthia Rudin, Ingrid Daubechies, and Robert E. Schapire. "The Dynamics of AdaBoost: Cyclic Behavior and Convergence of Margins." Journal of Machine Learning Research, 5:1557–1595, 2004.

Markdown

[Cynthia Rudin, Ingrid Daubechies, and Robert E. Schapire. "The Dynamics of AdaBoost: Cyclic Behavior and Convergence of Margins." Journal of Machine Learning Research, 5:1557–1595, 2004.](https://mlanthology.org/jmlr/2004/rudin2004jmlr-dynamics/)

BibTeX

@article{rudin2004jmlr-dynamics,
  title     = {{The Dynamics of AdaBoost: Cyclic Behavior and Convergence of Margins}},
  author    = {Rudin, Cynthia and Daubechies, Ingrid and Schapire, Robert E.},
  journal   = {Journal of Machine Learning Research},
  year      = {2004},
  pages     = {1557--1595},
  volume    = {5},
  url       = {https://mlanthology.org/jmlr/2004/rudin2004jmlr-dynamics/}
}