Agnostic Pointwise-Competitive Selective Classification
Abstract
A pointwise-competitive classifier from class F is required to classify identically to the best classifier in hindsight from F. For noisy, agnostic settings we present a strategy for learning pointwise-competitive classifiers from a finite training sample, provided that the classifier is allowed to abstain from prediction in a region of its choice. For some interesting hypothesis classes and families of distributions, the measure of this rejected region is shown to diminish at rate β1 · O((polylog(m) · log(1/δ)/m)^(β2/2)), with high probability, where m is the sample size, δ is the standard confidence parameter, and β1, β2 are the smoothness parameters of a Bernstein-type condition on the associated excess-loss class (related to F and the 0/1 loss). Exact implementation of the proposed learning strategy depends on an ERM oracle that is hard to compute in the agnostic case. We therefore consider a heuristic approximation procedure based on SVMs, and show empirically that this algorithm consistently outperforms a traditional rejection mechanism based on distance from the decision boundary.
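The baseline mechanism the abstract compares against, rejection by distance from the decision boundary, can be sketched as follows. This is a minimal illustration, not the paper's proposed method; the weights, bias, and threshold `tau` are hypothetical values chosen for the example.

```python
# Sketch of distance-from-boundary rejection for a linear classifier.
# The classifier f(x) = <w, x> + b and the threshold tau are illustrative
# assumptions, not parameters taken from the paper.
import math

w, b = (1.0, -0.5), 0.2  # hypothetical linear classifier

def selective_predict(x, tau=0.3):
    """Return +1/-1, or None (abstain) when x lies within tau of the boundary."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    margin = abs(score) / math.hypot(*w)  # geometric distance to the hyperplane
    if margin < tau:
        return None  # reject: low-confidence region near the boundary
    return 1 if score > 0 else -1

selective_predict((2.0, 0.0))  # far from the boundary: a confident prediction
selective_predict((0.1, 0.3))  # near the boundary: abstains
```

In the SVM-based heuristic the paper studies empirically, the rejected region is instead derived from the learning strategy's ERM-oracle construction rather than from a fixed margin threshold like `tau`.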
Cite
Text
Wiener and El-Yaniv. "Agnostic Pointwise-Competitive Selective Classification." Journal of Artificial Intelligence Research, 2015. doi:10.1613/JAIR.4439
Markdown
[Wiener and El-Yaniv. "Agnostic Pointwise-Competitive Selective Classification." Journal of Artificial Intelligence Research, 2015.](https://mlanthology.org/jair/2015/wiener2015jair-agnostic/) doi:10.1613/JAIR.4439
BibTeX
@article{wiener2015jair-agnostic,
title = {{Agnostic Pointwise-Competitive Selective Classification}},
author = {Wiener, Yair and El-Yaniv, Ran},
journal = {Journal of Artificial Intelligence Research},
year = {2015},
pages = {171-201},
doi = {10.1613/JAIR.4439},
volume = {52},
url = {https://mlanthology.org/jair/2015/wiener2015jair-agnostic/}
}