Accelerated Incremental Gradient Descent Using Momentum Acceleration with Scaling Factor

Abstract

Recently, research on variance-reduced incremental gradient descent methods (e.g., SAGA) has made exciting progress, such as linear convergence for strongly convex (SC) problems. However, existing accelerated variants (e.g., Point-SAGA) suffer from drawbacks such as inflexibility. In this paper, we design a novel and simple momentum acceleration technique with a scaling factor to accelerate the classical SAGA algorithm, and propose a directly accelerated incremental gradient descent algorithm. In particular, our theoretical analysis shows that the algorithm attains the best-known oracle complexity for strongly convex problems and an improved convergence rate in the case n ≥ L/μ, where n is the number of component functions, L is the smoothness constant, and μ is the strong-convexity parameter. We also present experimental results that corroborate our theoretical analysis and demonstrate the effectiveness of the algorithm.
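
To make the variance-reduction step concrete, below is a minimal sketch of the classical SAGA update combined with a generic heavy-ball momentum buffer scaled by a factor beta, applied to a least-squares problem. The momentum form, the function name saga_momentum, and the parameters lr and beta are illustrative assumptions for this sketch only; the paper's actual momentum design, scaling factor, and step sizes are given in the paper itself.

import numpy as np

def saga_momentum(A, b, lr=0.01, beta=0.2, epochs=50, seed=0):
    # SAGA on the least-squares objective f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2,
    # combined with a heavy-ball momentum buffer scaled by beta.
    # NOTE: the momentum form is an illustrative assumption, not the paper's update.
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    v = np.zeros(d)                          # momentum buffer
    table = (A @ x - b)[:, None] * A         # stored gradient of each f_i
    g_avg = table.mean(axis=0)               # average of the gradient table
    for _ in range(epochs * n):
        j = rng.integers(n)
        g_new = (A[j] @ x - b[j]) * A[j]     # fresh gradient of f_j at current x
        g_saga = g_new - table[j] + g_avg    # unbiased variance-reduced estimate
        v = beta * v - lr * g_saga           # scaled momentum step (assumed form)
        x = x + v
        g_avg += (g_new - table[j]) / n      # keep the table average in O(d) per step
        table[j] = g_new
    return x

# Quick check against a known least-squares solution.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 10))
x_true = rng.standard_normal(10)
b = A @ x_true
print(np.linalg.norm(saga_momentum(A, b) - x_true))  # should be near 0

Note that g_avg is updated in O(d) time per iteration rather than recomputed from the table, which is what makes SAGA-style methods practical when n is large.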

Cite

Text

Liu et al. "Accelerated Incremental Gradient Descent Using Momentum Acceleration with Scaling Factor." International Joint Conference on Artificial Intelligence, 2019. doi:10.24963/ijcai.2019/422

Markdown

[Liu et al. "Accelerated Incremental Gradient Descent Using Momentum Acceleration with Scaling Factor." International Joint Conference on Artificial Intelligence, 2019.](https://mlanthology.org/ijcai/2019/liu2019ijcai-accelerated/) doi:10.24963/ijcai.2019/422

BibTeX

@inproceedings{liu2019ijcai-accelerated,
  title     = {{Accelerated Incremental Gradient Descent Using Momentum Acceleration with Scaling Factor}},
  author    = {Liu, Yuanyuan and Shang, Fanhua and Jiao, Licheng},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2019},
  pages     = {3045--3051},
  doi       = {10.24963/ijcai.2019/422},
  url       = {https://mlanthology.org/ijcai/2019/liu2019ijcai-accelerated/}
}