Toward Better PAC-Bayes Bounds for Uniformly Stable Algorithms

Abstract

We give sharper bounds for uniformly stable randomized algorithms in a PAC-Bayesian framework, improving existing results by up to a factor of $\sqrt{n}$ (ignoring a log factor), where $n$ is the sample size. The key idea is to bound the moment generating function of the generalization gap using the concentration of weakly dependent random variables due to Bousquet et al. (2020). We introduce an assumption that the stability parameter is sub-exponential, which allows a general treatment that we instantiate in two applications: stochastic gradient descent and randomized coordinate descent. Our results remove the strong convexity requirement of previous results and hold for non-smooth convex problems.
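For context, the quantities named in the abstract can be stated using the standard definitions from the algorithmic stability literature; the notation below is illustrative and is not taken verbatim from the paper. Given an algorithm $A$ mapping a sample $S = \{z_1, \dots, z_n\}$ drawn from a distribution $\mathcal{D}$ to a hypothesis $A(S)$, and a loss $\ell$, the generalization gap and the uniform stability parameter $\beta$ are

$$\Delta(S) \;=\; \mathbb{E}_{z \sim \mathcal{D}}\big[\ell(A(S), z)\big] \;-\; \frac{1}{n}\sum_{i=1}^{n} \ell(A(S), z_i),$$

$$\sup_{S,\, S^{(i)},\, z} \big|\ell(A(S), z) - \ell(A(S^{(i)}), z)\big| \;\le\; \beta,$$

where $S^{(i)}$ differs from $S$ only in the $i$-th example (for randomized algorithms, the stability condition is typically taken in expectation over the algorithm's internal randomness). The paper's approach controls the moment generating function of $\Delta(S)$ under a sub-exponential tail assumption on $\beta$.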

Cite

Text

Zhou et al. "Toward Better PAC-Bayes Bounds for Uniformly Stable Algorithms." Neural Information Processing Systems, 2023.

Markdown

[Zhou et al. "Toward Better PAC-Bayes Bounds for Uniformly Stable Algorithms." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/zhou2023neurips-better/)

BibTeX

@inproceedings{zhou2023neurips-better,
  title     = {{Toward Better PAC-Bayes Bounds for Uniformly Stable Algorithms}},
  author    = {Zhou, Sijia and Lei, Yunwen and Kaban, Ata},
  booktitle = {Neural Information Processing Systems},
  year      = {2023},
  url       = {https://mlanthology.org/neurips/2023/zhou2023neurips-better/}
}