TaylorBoost: First and Second-Order Boosting Algorithms with Explicit Margin Control
Abstract
A new family of boosting algorithms, denoted TaylorBoost, is proposed. It supports any combination of loss function and first- or second-order optimization, and includes classical algorithms such as AdaBoost, GradientBoost, and LogitBoost as special cases. Its restriction to the set of canonical losses enables boosting algorithms with explicit margin control. A new large family of losses with this property, based on the set of cumulative distributions of zero-mean random variables, is then proposed. A novel loss function in this family, the Laplace loss, is finally derived. The combination of this loss and second-order TaylorBoost produces a boosting algorithm with explicit margin control.
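To make the first-order case concrete, the following is a minimal sketch of the classical functional-gradient boosting step (GradientBoost), which the abstract lists as a special case of TaylorBoost. The exponential loss, the decision-stump weak learner, and the fixed step size `lr` are illustrative assumptions for this sketch, not the paper's exact procedure (which also covers second-order updates and general canonical losses).

```python
import numpy as np

def best_stump(X, r):
    """Pick the decision stump h(x) = s * sign(x_j > t) that best
    correlates with the residual vector r (the weak-learner step)."""
    best = (0, 0.0, 1.0)
    best_score = -np.inf
    n, d = X.shape
    for j in range(d):
        for t in np.unique(X[:, j]):
            h = np.where(X[:, j] > t, 1.0, -1.0)
            for s in (1.0, -1.0):
                score = np.dot(r, s * h)  # correlation with residuals
                if score > best_score:
                    best, best_score = (j, t, s), score
    return best

def gradient_boost(X, y, n_rounds=10, lr=0.5):
    """First-order boosting on the exponential loss L(v) = exp(-v),
    with labels y in {-1, +1}. Each round fits a stump to the
    negative functional gradient and takes a fixed-size step
    (a line search would be used in practice)."""
    F = np.zeros(len(y))
    ensemble = []
    for _ in range(n_rounds):
        # negative gradient of sum_i exp(-y_i F_i) w.r.t. F
        r = y * np.exp(-y * F)
        j, t, s = best_stump(X, r)
        F += lr * s * np.where(X[:, j] > t, 1.0, -1.0)
        ensemble.append((j, t, s, lr))
    return ensemble

def predict(ensemble, X):
    """Sign of the weighted vote of all selected stumps."""
    F = np.zeros(len(X))
    for j, t, s, w in ensemble:
        F += w * s * np.where(X[:, j] > t, 1.0, -1.0)
    return np.sign(F)
```

Swapping the exponential loss for another (canonical) loss only changes the residual formula `r`; the second-order variants described in the paper additionally weight the weak-learner fit by the loss curvature.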
Cite
Text
Saberian et al. "TaylorBoost: First and Second-Order Boosting Algorithms with Explicit Margin Control." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2011. doi:10.1109/CVPR.2011.5995605
Markdown
[Saberian et al. "TaylorBoost: First and Second-Order Boosting Algorithms with Explicit Margin Control." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2011.](https://mlanthology.org/cvpr/2011/saberian2011cvpr-taylorboost/) doi:10.1109/CVPR.2011.5995605
BibTeX
@inproceedings{saberian2011cvpr-taylorboost,
title = {{TaylorBoost: First and Second-Order Boosting Algorithms with Explicit Margin Control}},
author = {Saberian, Mohammad J. and Masnadi-Shirazi, Hamed and Vasconcelos, Nuno},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition},
year = {2011},
pages = {2929--2934},
doi = {10.1109/CVPR.2011.5995605},
url = {https://mlanthology.org/cvpr/2011/saberian2011cvpr-taylorboost/}
}