Optimally-Smooth Adaptive Boosting and Application to Agnostic Learning
Abstract
We construct a boosting algorithm that is the first to be both smooth and adaptive. These two features make it possible to improve performance for many learning tasks whose solutions use a boosting technique. Originally, the boosting approach was suggested for the standard PAC model; we analyze possible applications of boosting in the model of agnostic learning (which is “more realistic” than PAC). We derive a lower bound on the final error achievable by boosting in the agnostic model, and we show that our algorithm actually achieves that accuracy (within a constant factor of 2): when the booster faces distribution $D$, its final error is bounded above by $\frac{1}{1/2 - \beta}\,err_D(F) + \zeta$, where $err_D(F) + \beta$ is an upper bound on the error of a hypothesis received from the (agnostic) weak learner when it faces distribution $D$, and $\zeta$ is any real such that the complexity of the boosting is polynomial in $1/\zeta$. We note that the idea of applying boosting in the agnostic model was first suggested by Ben-David, Long and Mansour, and the above accuracy is an exponential improvement w.r.t. $\beta$ over their result of $\left(\frac{1}{1/2 - \beta}\,err_D(F)\right)^{2(1/2 - \beta)^2/\ln(1/\beta - 1)} + \zeta$. Finally, we construct a boosting “tandem”, thus approaching in terms of $O$ the lowest possible number of boosting iterations, as well as in terms of $\tilde{O}$ the best possible smoothness. This allows solving adaptively problems whose solutions are based on smooth boosting (such as noise-tolerant boosting and DNF membership learning), while preserving the original solutions’ complexity.
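As a rough illustration of the gap between the two bounds quoted above, the following sketch plugs sample values into both formulas. The numbers ($\beta = 0.1$, $err_D(F) = 0.05$, $\zeta = 0$) are arbitrary illustrative choices, not taken from the paper:

```python
import math

def bdlm_bound(err, beta, zeta=0.0):
    # Ben-David/Long/Mansour-style bound quoted in the abstract:
    # (err / (1/2 - beta)) ** (2 * (1/2 - beta)**2 / ln(1/beta - 1)) + zeta
    base = err / (0.5 - beta)
    exponent = 2 * (0.5 - beta) ** 2 / math.log(1 / beta - 1)
    return base ** exponent + zeta

def gavinsky_bound(err, beta, zeta=0.0):
    # Bound achieved by the algorithm of this paper: err / (1/2 - beta) + zeta
    return err / (0.5 - beta) + zeta

# Illustrative values: weak-learner parameter beta = 0.1, err_D(F) = 0.05
print(bdlm_bound(0.05, 0.1))      # ≈ 0.74
print(gavinsky_bound(0.05, 0.1))  # 0.125
```

Because the earlier bound raises $err_D(F)/(1/2-\beta)$ to a small positive power, it stays close to 1 even when the agnostic error is tiny, whereas the new bound shrinks linearly with it; this is the sense in which the improvement is exponential in $\beta$.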
Cite
Text
Gavinsky. "Optimally-Smooth Adaptive Boosting and Application to Agnostic Learning." International Conference on Algorithmic Learning Theory, 2002. doi:10.1007/3-540-36169-3_10
Markdown
[Gavinsky. "Optimally-Smooth Adaptive Boosting and Application to Agnostic Learning." International Conference on Algorithmic Learning Theory, 2002.](https://mlanthology.org/alt/2002/gavinsky2002alt-optimallysmooth/) doi:10.1007/3-540-36169-3_10
BibTeX
@inproceedings{gavinsky2002alt-optimallysmooth,
title = {{Optimally-Smooth Adaptive Boosting and Application to Agnostic Learning}},
author = {Gavinsky, Dmitry},
booktitle = {International Conference on Algorithmic Learning Theory},
year = {2002},
pages = {98-112},
doi = {10.1007/3-540-36169-3_10},
url = {https://mlanthology.org/alt/2002/gavinsky2002alt-optimallysmooth/}
}