Optimal Oracle Inequality for Aggregation of Classifiers Under Low Noise Condition

Abstract

We consider the problem of optimality, in a minimax sense, and of adaptivity to the margin and to regularity in binary classification. We prove an oracle inequality, satisfied under the margin assumption (low noise condition) by an aggregation procedure which uses exponential weights. This oracle inequality has an optimal residual, (log M/n)^{κ/(2κ-1)}, where κ is the margin parameter, M the number of classifiers to aggregate, and n the number of observations. We use this inequality first to construct minimax classifiers under margin and regularity assumptions, and second to aggregate them to obtain a classifier adaptive both to the margin and to regularity. Moreover, by aggregating plug-in classifiers (only log n of them), we provide an easily implementable classifier adaptive both to the margin and to regularity.
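To illustrate the kind of procedure the abstract refers to, here is a minimal sketch of aggregation by exponential weights: each of the M base classifiers receives a weight proportional to the exponential of its (negative, scaled) empirical risk, and the aggregate predicts by weighted majority vote. The `temperature` parameter and the specific 0-1 loss are illustrative assumptions, not the paper's exact procedure or tuning.

```python
import numpy as np

def exponential_weights_aggregate(predictions, labels, temperature):
    """Aggregate M classifiers by exponential weights on empirical risk.

    predictions: (M, n) array of {0, 1} predictions, one row per classifier
    labels:      (n,) array of {0, 1} true labels
    temperature: scaling of the weights (an assumed free parameter here)
    Returns the weight vector and the aggregated weighted-majority rule.
    """
    # Empirical 0-1 risk of each of the M base classifiers
    risks = (predictions != labels).mean(axis=1)
    w = np.exp(-temperature * risks)
    w /= w.sum()  # normalized exponential weights

    def aggregate(new_predictions):
        # Weighted majority vote of the M base rules
        return (w @ new_predictions >= 0.5).astype(int)

    return w, aggregate

# Toy usage: M = 3 classifiers, n = 4 observations
preds = np.array([[0, 1, 1, 0],
                  [0, 1, 0, 0],
                  [1, 1, 1, 1]])
labels = np.array([0, 1, 1, 0])
w, f_hat = exponential_weights_aggregate(preds, labels, temperature=4.0)
```

The lowest-risk classifier (here the first row, which matches the labels exactly) receives the largest weight, and the aggregate concentrates on it as the temperature grows.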

Cite

Text

Lecué. "Optimal Oracle Inequality for Aggregation of Classifiers Under Low Noise Condition." Annual Conference on Computational Learning Theory, 2006. doi:10.1007/11776420_28

Markdown

[Lecué. "Optimal Oracle Inequality for Aggregation of Classifiers Under Low Noise Condition." Annual Conference on Computational Learning Theory, 2006.](https://mlanthology.org/colt/2006/lecue2006colt-optimal/) doi:10.1007/11776420_28

BibTeX

@inproceedings{lecue2006colt-optimal,
  title     = {{Optimal Oracle Inequality for Aggregation of Classifiers Under Low Noise Condition}},
  author    = {Lecué, Guillaume},
  booktitle = {Annual Conference on Computational Learning Theory},
  year      = {2006},
  pages     = {364--378},
  doi       = {10.1007/11776420_28},
  url       = {https://mlanthology.org/colt/2006/lecue2006colt-optimal/}
}