On Robustness of On-Line Boosting - A Competitive Study
Abstract
On-line boosting is one of the most successful on-line algorithms and is thus applied in many computer vision applications. However, even though boosting in general is well known to be susceptible to class-label noise, on-line boosting is mostly applied to self-learning applications such as visual object tracking, where label noise is an inherent problem. This paper studies the robustness of on-line boosting. Since the applied loss function mainly determines the behavior of boosting, we propose an on-line version of GradientBoost, which allows us to plug arbitrary loss functions into the on-line learner. Hence, we can easily study the importance and the behavior of different loss functions. We evaluate various on-line boosting algorithms in the form of a competitive study on standard machine learning problems as well as on common computer vision applications such as tracking and autonomous training of object detectors. Our results show that using on-line GradientBoost with robust loss functions leads to superior results in all our experiments.
Cite
Text
Leistner et al. "On Robustness of On-Line Boosting - A Competitive Study." IEEE/CVF International Conference on Computer Vision Workshops, 2009. doi:10.1109/ICCVW.2009.5457451
Markdown
[Leistner et al. "On Robustness of On-Line Boosting - A Competitive Study." IEEE/CVF International Conference on Computer Vision Workshops, 2009.](https://mlanthology.org/iccvw/2009/leistner2009iccvw-robustness/) doi:10.1109/ICCVW.2009.5457451
BibTeX
@inproceedings{leistner2009iccvw-robustness,
title = {{On Robustness of On-Line Boosting - A Competitive Study}},
author = {Leistner, Christian and Saffari, Amir and Roth, Peter M. and Bischof, Horst},
booktitle = {IEEE/CVF International Conference on Computer Vision Workshops},
year = {2009},
pages = {1362-1369},
doi = {10.1109/ICCVW.2009.5457451},
url = {https://mlanthology.org/iccvw/2009/leistner2009iccvw-robustness/}
}