Localization of VC Classes: Beyond Local Rademacher Complexities
Abstract
In statistical learning, the excess risk of empirical risk minimization (ERM) is controlled by \(\left(\frac{\text{COMP}_n(\mathcal{F})}{n}\right)^{\alpha}\), where \(n\) is the size of the learning sample, \(\text{COMP}_n(\mathcal{F})\) is a complexity term associated with a given class \(\mathcal{F}\), and \(\alpha \in [\frac{1}{2}, 1]\) interpolates between slow and fast learning rates. In this paper we introduce an alternative localization approach for binary classification that leads to a novel complexity measure: fixed points of the local empirical entropy. We show that this complexity measure gives tight control over \(\text{COMP}_n(\mathcal{F})\) in the upper bounds under bounded noise. Our results are accompanied by a novel minimax lower bound that involves the same quantity. In particular, we essentially answer the question of the optimality of ERM under bounded noise for general VC classes.
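For concreteness, the two endpoints of this interpolation can be written out explicitly; the following is a schematic reading of the bound stated above, not a result quoted from the paper:
\[
\alpha = \tfrac{1}{2}:\ \left(\frac{\text{COMP}_n(\mathcal{F})}{n}\right)^{1/2} \ \text{(slow rate)}, \qquad \alpha = 1:\ \frac{\text{COMP}_n(\mathcal{F})}{n} \ \text{(fast rate)}.
\]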
Cite
Text
Zhivotovskiy and Hanneke. "Localization of VC Classes: Beyond Local Rademacher Complexities." International Conference on Algorithmic Learning Theory, 2016. doi:10.1007/978-3-319-46379-7_2
Markdown
[Zhivotovskiy and Hanneke. "Localization of VC Classes: Beyond Local Rademacher Complexities." International Conference on Algorithmic Learning Theory, 2016.](https://mlanthology.org/alt/2016/zhivotovskiy2016alt-localization/) doi:10.1007/978-3-319-46379-7_2
BibTeX
@inproceedings{zhivotovskiy2016alt-localization,
title = {{Localization of VC Classes: Beyond Local Rademacher Complexities}},
author = {Zhivotovskiy, Nikita and Hanneke, Steve},
booktitle = {International Conference on Algorithmic Learning Theory},
year = {2016},
pages = {18--33},
doi = {10.1007/978-3-319-46379-7_2},
url = {https://mlanthology.org/alt/2016/zhivotovskiy2016alt-localization/}
}