Size-Independent Sample Complexity of Neural Networks

Abstract

We study the sample complexity of learning neural networks by providing new bounds on their Rademacher complexity, assuming norm constraints on the parameter matrix of each layer. Compared to previous work, these complexity bounds have improved dependence on the network depth and, under some additional assumptions, are fully independent of the network size (both depth and width). These results are derived using some novel techniques, which may be of independent interest.
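To make the abstract's claim concrete, the display below sketches the general shape of such a Rademacher complexity bound. It is an illustrative schematic under assumed notation (sample size m, input norm bound B, network depth d, and a Frobenius-norm bound M_F(j) on the weight matrix of layer j), not a verbatim statement of the paper's theorems; exact constants and conditions are given in the paper.

\[
\mathcal{R}_m(\mathcal{H}_d) \;\lesssim\; \frac{B\,\sqrt{d}\,\prod_{j=1}^{d} M_F(j)}{\sqrt{m}},
\]

where \(\mathcal{H}_d\) denotes depth-\(d\) networks whose layer-\(j\) weight matrix has Frobenius norm at most \(M_F(j)\). The \(\sqrt{d}\) factor stands in for the milder depth dependence claimed in the abstract, and the additional assumptions mentioned there yield bounds with no explicit dependence on the depth or the layer widths at all.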

Cite

Text

Noah Golowich, Alexander Rakhlin, Ohad Shamir. "Size-Independent Sample Complexity of Neural Networks." Annual Conference on Computational Learning Theory, 2018. doi:10.1093/IMAIAI/IAZ007

Markdown

[Noah Golowich, Alexander Rakhlin, Ohad Shamir. "Size-Independent Sample Complexity of Neural Networks." Annual Conference on Computational Learning Theory, 2018.](https://mlanthology.org/colt/2018/golowich2018colt-size/) doi:10.1093/IMAIAI/IAZ007

BibTeX

@inproceedings{golowich2018colt-size,
  title     = {{Size-Independent Sample Complexity of Neural Networks}},
  author    = {Golowich, Noah and Rakhlin, Alexander and Shamir, Ohad},
  booktitle = {Annual Conference on Computational Learning Theory},
  year      = {2018},
  pages     = {297-299},
  doi       = {10.1093/IMAIAI/IAZ007},
  url       = {https://mlanthology.org/colt/2018/golowich2018colt-size/}
}