Direct Optimization of Margins Improves Generalization in Combined Classifiers
Abstract
Cumulative training margin distributions for AdaBoost versus our "Direct Optimization Of Margins" (DOOM) algorithm. The dark curve is AdaBoost, the light curve is DOOM. DOOM sacrifices significant training error for improved test error (horizontal marks on the margin = 0 line).
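The caption refers to the cumulative training margin distribution of a voted (combined) classifier, i.e. the fraction of training examples whose normalized margin falls at or below each threshold. As a rough illustration of that quantity only (not of the paper's DOOM algorithm itself), here is a minimal NumPy sketch; the function names and the randomly generated data are hypothetical.

```python
import numpy as np

def margins(H, alpha, y):
    """Normalized margins of a voted (combined) classifier.

    H     : (n_examples, n_base) base-classifier outputs in {-1, +1}
    alpha : (n_base,) non-negative voting weights
    y     : (n_examples,) labels in {-1, +1}

    The margin of example i is y_i * sum_t alpha_t h_t(x_i) / sum_t |alpha_t|,
    so it lies in [-1, 1]; a positive margin means the weighted vote is correct.
    """
    return y * (H @ alpha) / np.sum(np.abs(alpha))

def cumulative_margin_distribution(m, thetas):
    """Fraction of training examples whose margin is at most each threshold."""
    m = np.sort(m)
    return np.searchsorted(m, thetas, side="right") / len(m)

# Hypothetical example: margins of one weighting of 25 base classifiers.
rng = np.random.default_rng(0)
H = rng.choice([-1, 1], size=(200, 25))
y = rng.choice([-1, 1], size=200)
alpha = rng.random(25)
thetas = np.linspace(-1.0, 1.0, 101)
cdf = cumulative_margin_distribution(margins(H, alpha, y), thetas)
```

Plotting `cdf` against `thetas` for two different weightings of the same base classifiers gives curves of the kind the figure compares: the value of the curve at margin = 0 is the training error of the corresponding combined classifier.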
Cite
Text:
Mason et al. "Direct Optimization of Margins Improves Generalization in Combined Classifiers." Neural Information Processing Systems, 1998.

Markdown:
[Mason et al. "Direct Optimization of Margins Improves Generalization in Combined Classifiers." Neural Information Processing Systems, 1998.](https://mlanthology.org/neurips/1998/mason1998neurips-direct/)

BibTeX:
@inproceedings{mason1998neurips-direct,
title = {{Direct Optimization of Margins Improves Generalization in Combined Classifiers}},
author = {Mason, Llew and Bartlett, Peter L. and Baxter, Jonathan},
booktitle = {Neural Information Processing Systems},
year = {1998},
pages = {288-294},
url = {https://mlanthology.org/neurips/1998/mason1998neurips-direct/}
}