A Vector-Contraction Inequality for Rademacher Complexities
Abstract
The contraction inequality for Rademacher averages is extended to Lipschitz functions with vector-valued domains, and it is also shown that in the bounding expression the Rademacher variables can be replaced by arbitrary iid symmetric and sub-gaussian variables. Example applications are given for multi-category learning, K-means clustering and learning-to-learn.
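For orientation, the vector-contraction inequality the abstract refers to can be sketched as follows (a paraphrase in standard notation; the exact constants and conditions are as in the paper, and the symbols F, x_i, h_i, ε are my labeling, not a quotation of the paper's statement):

```latex
% If each h_i : \ell_2 \to \mathbb{R} is L-Lipschitz, F is a class of
% functions f : X \to \ell_2 with components f_k, and \epsilon_i,
% \epsilon_{ik} are iid Rademacher variables, then
\mathbb{E}\,\sup_{f \in \mathcal{F}} \sum_{i} \epsilon_i\, h_i\!\bigl(f(x_i)\bigr)
\;\le\; \sqrt{2}\, L\;
\mathbb{E}\,\sup_{f \in \mathcal{F}} \sum_{i,k} \epsilon_{ik}\, f_k(x_i).
```

The right-hand side replaces a single Rademacher variable per sample with one per sample-and-coordinate, which is what makes the bound applicable to vector-valued hypothesis classes such as those arising in multi-category learning and K-means clustering.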
Cite
Text
Maurer. "A Vector-Contraction Inequality for Rademacher Complexities." International Conference on Algorithmic Learning Theory, 2016. doi:10.1007/978-3-319-46379-7_1
Markdown
[Maurer. "A Vector-Contraction Inequality for Rademacher Complexities." International Conference on Algorithmic Learning Theory, 2016.](https://mlanthology.org/alt/2016/maurer2016alt-vectorcontraction/) doi:10.1007/978-3-319-46379-7_1
BibTeX
@inproceedings{maurer2016alt-vectorcontraction,
title = {{A Vector-Contraction Inequality for Rademacher Complexities}},
author = {Maurer, Andreas},
booktitle = {International Conference on Algorithmic Learning Theory},
year = {2016},
pages = {3--17},
doi = {10.1007/978-3-319-46379-7_1},
url = {https://mlanthology.org/alt/2016/maurer2016alt-vectorcontraction/}
}