Stochastic Optimization with Heavy-Tailed Noise via Accelerated Gradient Clipping

Abstract

In this paper, we propose a new accelerated stochastic first-order method called clipped-SSTM for smooth convex stochastic optimization with heavy-tailed noise in the stochastic gradients, and derive the first high-probability complexity bounds for this method, closing a gap in the theory of stochastic optimization with heavy-tailed noise. Our method is based on a special variant of accelerated Stochastic Gradient Descent (SGD) combined with clipping of the stochastic gradients. We extend our method to the strongly convex case and prove new complexity bounds that outperform the state-of-the-art results in this setting. Finally, we extend our proof technique and derive the first non-trivial high-probability complexity bounds for SGD with clipping without the light-tails assumption on the noise.
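To make the clipping idea concrete, below is a minimal sketch of the standard gradient-clipping operator, min(1, λ/‖g‖)·g, together with a plain SGD loop that applies it. This is an illustration only, not the paper's clipped-SSTM: the accelerated (SSTM-style) update, the stepsize and clipping-level schedules, and the batching used in the paper are not reproduced here, and the names `stoch_grad`, `lam`, and `stepsize` are assumed placeholders.

```python
import numpy as np

def clip(g, lam):
    """Standard clipping operator: return min(1, lam / ||g||) * g."""
    norm = np.linalg.norm(g)
    if norm <= lam:
        return g
    return (lam / norm) * g

def clipped_sgd(stoch_grad, x0, stepsize, lam, n_iters):
    """Illustrative SGD with clipped stochastic gradients.

    stoch_grad: callable x -> stochastic gradient estimate at x (assumed).
    lam: clipping level; stepsize: fixed stepsize (the paper uses
    problem-dependent schedules, not reproduced here).
    """
    x = np.array(x0, dtype=float)
    for _ in range(n_iters):
        g = clip(stoch_grad(x), lam)  # clip before the update step
        x = x - stepsize * g
    return x
```

Clipping bounds the norm of each stochastic gradient, which is what allows high-probability guarantees even when the noise distribution is heavy-tailed.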

Cite

Text

Gorbunov et al. "Stochastic Optimization with Heavy-Tailed Noise via Accelerated Gradient Clipping." Neural Information Processing Systems, 2020.

Markdown

[Gorbunov et al. "Stochastic Optimization with Heavy-Tailed Noise via Accelerated Gradient Clipping." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/gorbunov2020neurips-stochastic/)

BibTeX

@inproceedings{gorbunov2020neurips-stochastic,
  title     = {{Stochastic Optimization with Heavy-Tailed Noise via Accelerated Gradient Clipping}},
  author    = {Gorbunov, Eduard and Danilova, Marina and Gasnikov, Alexander},
  booktitle = {Neural Information Processing Systems},
  year      = {2020},
  url       = {https://mlanthology.org/neurips/2020/gorbunov2020neurips-stochastic/}
}