Concentration Inequalities Under Sub-Gaussian and Sub-Exponential Conditions
Abstract
We prove analogues of the popular bounded difference inequality (also called McDiarmid's inequality) for functions of independent random variables under sub-gaussian and sub-exponential conditions. Applied to vector-valued concentration and the method of Rademacher complexities, these inequalities allow an easy extension of uniform convergence results for PCA and linear regression to the case of potentially unbounded input and output variables.
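For context, the classical bounded difference (McDiarmid) inequality that the paper generalizes can be stated as follows; this is a standard textbook formulation, and the notation is ours rather than taken from the paper:

```latex
% McDiarmid's bounded difference inequality (standard form).
% Let X_1, ..., X_n be independent random variables and let f satisfy,
% for every index k and all arguments x_1, ..., x_n, x'_k,
%   |f(x_1,...,x_k,...,x_n) - f(x_1,...,x'_k,...,x_n)| \le c_k.
% Then for every t > 0,
\Pr\bigl( f(X_1,\dots,X_n) - \mathbb{E}\, f(X_1,\dots,X_n) \ge t \bigr)
  \le \exp\!\left( -\frac{2 t^2}{\sum_{k=1}^{n} c_k^2} \right).
```

The boundedness of the differences \(c_k\) is exactly the assumption the paper relaxes, replacing it with sub-gaussian or sub-exponential conditions on the differences.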
Cite
Text
Maurer and Pontil. "Concentration Inequalities Under Sub-Gaussian and Sub-Exponential Conditions." Neural Information Processing Systems, 2021.
Markdown
[Maurer and Pontil. "Concentration Inequalities Under Sub-Gaussian and Sub-Exponential Conditions." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/maurer2021neurips-concentration/)
BibTeX
@inproceedings{maurer2021neurips-concentration,
  title = {{Concentration Inequalities Under Sub-Gaussian and Sub-Exponential Conditions}},
  author = {Maurer, Andreas and Pontil, Massimiliano},
  booktitle = {Neural Information Processing Systems},
  year = {2021},
  url = {https://mlanthology.org/neurips/2021/maurer2021neurips-concentration/}
}