Boosted Dyadic Kernel Discriminants
Abstract
We introduce a novel learning algorithm for binary classification with hyperplane discriminants based on pairs of training points from opposite classes (dyadic hypercuts). This algorithm is further extended to nonlinear discriminants using kernel functions satisfying Mercer's conditions. An ensemble of simple dyadic hypercuts is learned incrementally by means of a confidence-rated version of AdaBoost, which provides a sound strategy for searching through the finite set of hypercut hypotheses. In experiments with real-world datasets from the UCI repository, the generalization performance of the hypercut classifiers was found to be comparable to that of SVMs and k-NN classifiers. Furthermore, the computational cost of classification (at run time) was found to be similar to, or better than, that of SVMs. Similarly to SVMs, boosted dyadic kernel discriminants tend to maximize the margin (via AdaBoost). In contrast to SVMs, however, we offer an on-line and incremental learning machine for building kernel discriminants whose complexity (number of kernel evaluations) can be directly controlled (traded off for accuracy).
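To make the procedure described in the abstract concrete, the following is a minimal, illustrative Python sketch, not the authors' implementation. It assumes a hypercut of the form h(x) = sign(K(x_p, x) - K(x_q, x) - theta) for a pair (x_p, x_q) drawn from opposite classes, an RBF kernel as the Mercer kernel, a small quantile grid for the threshold search, and the standard AdaBoost vote alpha = 0.5 ln((1 - err)/err) as the confidence weight; all function names and parameter choices here are assumptions for illustration.

```python
# Illustrative sketch only: assumes hypercuts of the form
# h(x) = sign(K(x_p, x) - K(x_q, x) - theta) with (x_p, x_q) from opposite
# classes, an RBF kernel, and standard AdaBoost-style confidence weights.
import numpy as np


def rbf_kernel(a, b, gamma=1.0):
    """RBF (Mercer) kernel matrix between row-vector sets a and b."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)


def fit_boosted_hypercuts(X, y, T=20, gamma=1.0):
    """Boost T dyadic hypercuts on labels y in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                      # AdaBoost sample weights
    K = rbf_kernel(X, X, gamma)                  # precomputed kernel matrix
    pos, neg = np.where(y == 1)[0], np.where(y == -1)[0]
    ensemble = []                                # (alpha, p, q, theta) tuples
    for _ in range(T):
        best = None
        for p in pos:                            # search the finite hypothesis set:
            for q in neg:                        # one hypercut per opposite-class pair
                proj = K[p] - K[q]               # projection of all points on the cut
                for theta in np.quantile(proj, [0.25, 0.5, 0.75]):
                    pred = np.where(proj > theta, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, p, q, theta, pred)
        err, p, q, theta, pred = best
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)    # confidence-rated vote
        w *= np.exp(-alpha * y * pred)           # re-weight the training points
        w /= w.sum()
        ensemble.append((alpha, p, q, theta))
    return ensemble


def predict(ensemble, X_train, X_test, gamma=1.0):
    """Sign of the weighted hypercut votes; only 2T kernel evaluations per
    test point, against the selected pairs rather than the whole training set."""
    f = np.zeros(len(X_test))
    for alpha, p, q, theta in ensemble:
        proj = (rbf_kernel(X_test, X_train[[p]], gamma)
                - rbf_kernel(X_test, X_train[[q]], gamma)).ravel()
        f += alpha * np.where(proj > theta, 1, -1)
    return np.sign(f)


# Example usage on toy data:
# X = np.random.randn(40, 2); y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
# ens = fit_boosted_hypercuts(X, y, T=10)
# acc = (predict(ens, X, X) == y).mean()
```

Note how the number of boosting rounds T directly bounds the classifier's run-time complexity (at most 2T kernel evaluations per test point), which is the complexity/accuracy trade-off the abstract refers to.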
Cite
Text
Moghaddam and Shakhnarovich. "Boosted Dyadic Kernel Discriminants." Neural Information Processing Systems, 2002.
Markdown
[Moghaddam and Shakhnarovich. "Boosted Dyadic Kernel Discriminants." Neural Information Processing Systems, 2002.](https://mlanthology.org/neurips/2002/moghaddam2002neurips-boosted/)
BibTeX
@inproceedings{moghaddam2002neurips-boosted,
title = {{Boosted Dyadic Kernel Discriminants}},
author = {Moghaddam, Baback and Shakhnarovich, Gregory},
booktitle = {Neural Information Processing Systems},
year = {2002},
pages = {761-768},
url = {https://mlanthology.org/neurips/2002/moghaddam2002neurips-boosted/}
}