Stochastic Gradient Descent for Non-Smooth Optimization: Convergence Results and Optimal Averaging Schemes

Abstract

Stochastic Gradient Descent (SGD) is one of the simplest and most popular stochastic optimization methods. While it has already been theoretically studied for decades, the classical analysis usually required non-trivial smoothness assumptions, which do not apply to many modern applications of SGD with non-smooth objective functions such as support vector machines. In this paper, we investigate the performance of SGD *without* such smoothness assumptions, as well as a running average scheme to convert the SGD iterates to a solution with optimal optimization accuracy. In this framework, we prove that after T rounds, the suboptimality of the *last* SGD iterate scales as O(log(T)/√T) for non-smooth convex objective functions, and O(log(T)/T) in the non-smooth strongly convex case. To the best of our knowledge, these are the first bounds of this kind, and almost match the minimax-optimal rates obtainable by appropriate averaging schemes. We also propose a new and simple averaging scheme, which not only attains optimal rates, but can also be easily computed on-the-fly (in contrast, the suffix averaging scheme proposed in Rakhlin et al. (2012) is not as simple to implement). Finally, we provide some experimental illustrations.

Cite

Text

Shamir and Zhang. "Stochastic Gradient Descent for Non-Smooth Optimization: Convergence Results and Optimal Averaging Schemes." International Conference on Machine Learning, 2013.

Markdown

[Shamir and Zhang. "Stochastic Gradient Descent for Non-Smooth Optimization: Convergence Results and Optimal Averaging Schemes." International Conference on Machine Learning, 2013.](https://mlanthology.org/icml/2013/shamir2013icml-stochastic/)

BibTeX

@inproceedings{shamir2013icml-stochastic,
  title     = {{Stochastic Gradient Descent for Non-Smooth Optimization: Convergence Results and Optimal Averaging Schemes}},
  author    = {Shamir, Ohad and Zhang, Tong},
  booktitle = {International Conference on Machine Learning},
  year      = {2013},
  pages     = {71--79},
  volume    = {28},
  url       = {https://mlanthology.org/icml/2013/shamir2013icml-stochastic/}
}