Efficient Sign-Based Optimization: Accelerating Convergence via Variance Reduction

Abstract

Sign stochastic gradient descent (signSGD) is a communication-efficient method that transmits only the sign of stochastic gradients to update parameters. Existing literature has demonstrated that signSGD can achieve a convergence rate of $\mathcal{O}(d^{1/2}T^{-1/4})$, where $d$ represents the dimension and $T$ is the number of iterations. In this paper, we improve this convergence rate to $\mathcal{O}(d^{1/2}T^{-1/3})$ by introducing the Sign-based Stochastic Variance Reduction (SSVR) method, which employs variance reduction estimators to track gradients and uses their signs for the parameter update. For finite-sum problems, our method can be further enhanced to achieve a convergence rate of $\mathcal{O}(m^{1/4}d^{1/2}T^{-1/2})$, where $m$ denotes the number of component functions. Furthermore, we investigate the heterogeneous majority vote in distributed settings and introduce two novel algorithms that attain improved convergence rates of $\mathcal{O}(d^{1/2}T^{-1/2} + dn^{-1/2})$ and $\mathcal{O}(d^{1/4}T^{-1/4})$ respectively, outperforming the previous results of $\mathcal{O}(dT^{-1/4} + dn^{-1/2})$ and $\mathcal{O}(d^{3/8}T^{-1/8})$, where $n$ represents the number of nodes. Numerical experiments across different tasks validate the effectiveness of our proposed methods.
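To make the core idea concrete, the following is a minimal sketch contrasting plain signSGD with a sign-based update driven by a STORM-style variance-reduced estimator on a toy quadratic. The function names, the toy objective, and the specific recursion are our illustrative assumptions, not the paper's exact SSVR algorithm.

```python
import numpy as np

# Illustrative sketch only: toy objective f(x) = 0.5 * ||x - target||^2,
# with Gaussian noise simulating a stochastic gradient oracle.
rng = np.random.default_rng(0)
d = 10
target = rng.normal(size=d)  # the minimizer of the toy objective

def grad_pair(x_new, x_old, noise=0.1):
    # Stochastic gradients at two points using a SHARED noise sample,
    # as variance-reduced recursions of the STORM type require.
    xi = noise * rng.normal(size=d)
    return (x_new - target) + xi, (x_old - target) + xi

def signsgd(x0, lr=0.01, steps=2000):
    # Plain signSGD: step along the sign of a fresh stochastic gradient.
    x = x0.copy()
    for _ in range(steps):
        g, _ = grad_pair(x, x)
        x = x - lr * np.sign(g)
    return x

def ssvr_sketch(x0, lr=0.01, beta=0.1, steps=2000):
    # Track gradients with v_t = g(x_t) + (1 - beta) * (v_{t-1} - g(x_{t-1}))
    # and update with sign(v_t); a hedged sketch of the SSVR idea.
    x = x0.copy()
    v, _ = grad_pair(x, x)
    for _ in range(steps):
        x_prev = x.copy()
        x = x - lr * np.sign(v)
        g_new, g_old = grad_pair(x, x_prev)
        v = g_new + (1.0 - beta) * (v - g_old)
    return x

x0 = np.zeros(d)
x_sign = signsgd(x0)
x_vr = ssvr_sketch(x0)
```

Both variants transmit only $d$ signs per step, so the communication cost is identical; the variance-reduced tracker is what enables the sharper $\mathcal{O}(d^{1/2}T^{-1/3})$ rate in the paper's analysis.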

Cite

Text

Jiang et al. "Efficient Sign-Based Optimization: Accelerating Convergence via Variance Reduction." Neural Information Processing Systems, 2024. doi:10.52202/079017-1068

Markdown

[Jiang et al. "Efficient Sign-Based Optimization: Accelerating Convergence via Variance Reduction." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/jiang2024neurips-efficient/) doi:10.52202/079017-1068

BibTeX

@inproceedings{jiang2024neurips-efficient,
  title     = {{Efficient Sign-Based Optimization: Accelerating Convergence via Variance Reduction}},
  author    = {Jiang, Wei and Yang, Sifan and Yang, Wenhao and Zhang, Lijun},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-1068},
  url       = {https://mlanthology.org/neurips/2024/jiang2024neurips-efficient/}
}