Fast Excess Risk Rates via Offset Rademacher Complexity

Abstract

Based on offset Rademacher complexity, this work develops a systematic framework for deriving sharp excess risk bounds in statistical learning without the Bernstein condition. In addition to recovering fast rates in a unified way for several parametric and nonparametric supervised learning models under minimal identifiability assumptions, we also obtain new and improved results for LAD (sparse) linear regression and for logistic regression with deep ReLU neural networks, respectively.

Cite

Text

Duan et al. "Fast Excess Risk Rates via Offset Rademacher Complexity." International Conference on Machine Learning, 2023.

Markdown

[Duan et al. "Fast Excess Risk Rates via Offset Rademacher Complexity." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/duan2023icml-fast/)

BibTeX

@inproceedings{duan2023icml-fast,
  title     = {{Fast Excess Risk Rates via Offset Rademacher Complexity}},
  author    = {Duan, Chenguang and Jiao, Yuling and Kang, Lican and Lu, Xiliang and Yang, Jerry Zhijian},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {8697--8716},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/duan2023icml-fast/}
}