Stability of Stochastic Gradient Descent on Nonsmooth Convex Losses
Abstract
Uniform stability is a notion of algorithmic stability that bounds the worst-case change in the model output by the algorithm when a single data point in the dataset is replaced. An influential work of Hardt et al. [2016] provides strong upper bounds on the uniform stability of the stochastic gradient descent (SGD) algorithm on sufficiently smooth convex losses. These results led to important progress in understanding the generalization properties of SGD and to several applications of differentially private convex optimization for smooth losses.
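For reference, the standard notion of uniform stability underlying these results (as used by Hardt et al. [2016]) can be sketched as follows; the symbols below are generic placeholders for the algorithm, loss, and datasets rather than the paper's exact notation.

A randomized algorithm $A$ is $\varepsilon$-uniformly stable with respect to a loss $\ell$ if, for every pair of datasets $S, S'$ of size $n$ differing in a single example,
$$\sup_{z}\; \mathbb{E}_{A}\bigl[\ell(A(S), z) - \ell(A(S'), z)\bigr] \;\le\; \varepsilon,$$
where the supremum is over test points $z$ and the expectation is over the algorithm's internal randomness. A standard argument shows that $\varepsilon$-uniform stability bounds the expected generalization gap of $A$ by $\varepsilon$.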
Cite
Bassily et al. "Stability of Stochastic Gradient Descent on Nonsmooth Convex Losses." Neural Information Processing Systems, 2020.
https://mlanthology.org/neurips/2020/bassily2020neurips-stability/
@inproceedings{bassily2020neurips-stability,
title = {{Stability of Stochastic Gradient Descent on Nonsmooth Convex Losses}},
author = {Bassily, Raef and Feldman, Vitaly and Guzmán, Cristóbal and Talwar, Kunal},
booktitle = {Neural Information Processing Systems},
year = {2020},
url = {https://mlanthology.org/neurips/2020/bassily2020neurips-stability/}
}