Neyman-Pearson Classification, Convexity and Stochastic Constraints
Abstract
Motivated by problems of anomaly detection, this paper implements the Neyman-Pearson paradigm to deal with asymmetric errors in binary classification with a convex loss φ. Given a finite collection of classifiers, we combine them to obtain a new classifier that simultaneously satisfies, with high probability, the following two properties: (i) its φ-type I error is below a pre-specified level and (ii) its φ-type II error is close to the minimum possible. The proposed classifier is obtained by minimizing an empirical convex objective subject to an empirical convex constraint. The novelty of the method is that the classifier output by this computationally feasible program is shown to satisfy the original constraint on type I error. New techniques to handle such problems are developed, with consequences for chance-constrained programming. We also evaluate the price to pay, in terms of type II error, for being conservative on type I error.
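The program described above can be sketched numerically. The following is a minimal illustration, not the authors' exact algorithm: it takes a small dictionary of base classifiers, forms a convex combination of them, and minimizes the empirical φ-type II error subject to an empirical φ-type I error constraint, using the logistic surrogate φ(z) = log(1 + e^z). The data, thresholds, and level α are all invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic one-dimensional data: class -1 is the "null" class on which
# type I error is measured, class +1 is the alternative.
x_neg = rng.normal(0.0, 1.0, 300)
x_pos = rng.normal(2.0, 1.0, 300)

# Finite dictionary of base classifiers: sign(x - t) for a few thresholds t.
thresholds = np.array([0.5, 1.0, 1.5])
H_neg = np.sign(x_neg[:, None] - thresholds)  # values in {-1, +1}
H_pos = np.sign(x_pos[:, None] - thresholds)

phi = lambda z: np.logaddexp(0.0, z)  # convex surrogate: log(1 + e^z)
alpha = 0.45                          # illustrative phi-type I level

def type1(lam):
    # Empirical phi-type I error of the convex combination on class -1.
    return phi(H_neg @ lam).mean()

def type2(lam):
    # Empirical phi-type II error on class +1 (the objective).
    return phi(-(H_pos @ lam)).mean()

# Minimize type II subject to type I <= alpha, over convex weights lam.
cons = [{"type": "ineq", "fun": lambda lam: alpha - type1(lam)},
        {"type": "eq",   "fun": lambda lam: lam.sum() - 1.0}]
res = minimize(type2, x0=np.ones(3) / 3, bounds=[(0.0, 1.0)] * 3,
               constraints=cons, method="SLSQP")
lam = res.x
print("weights:", np.round(lam, 3), "phi-type I:", round(type1(lam), 3))
```

Both the objective and the constraint are convex in the weights, so the program is computationally feasible; the paper's contribution is the guarantee that (a tightened version of) such a program controls the original type I error with high probability, which this sketch does not attempt to reproduce.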
Cite
Text
Rigollet and Tong. "Neyman-Pearson Classification, Convexity and Stochastic Constraints." Journal of Machine Learning Research, 2011.
Markdown
[Rigollet and Tong. "Neyman-Pearson Classification, Convexity and Stochastic Constraints." Journal of Machine Learning Research, 2011.](https://mlanthology.org/jmlr/2011/rigollet2011jmlr-neymanpearson/)
BibTeX
@article{rigollet2011jmlr-neymanpearson,
title = {{Neyman-Pearson Classification, Convexity and Stochastic Constraints}},
author = {Rigollet, Philippe and Tong, Xin},
journal = {Journal of Machine Learning Research},
year = {2011},
pages = {2831-2855},
volume = {12},
url = {https://mlanthology.org/jmlr/2011/rigollet2011jmlr-neymanpearson/}
}