ASVRG: Accelerated Proximal SVRG
Abstract
This paper proposes an accelerated proximal stochastic variance reduced gradient (ASVRG) method, in which we design a simple and effective momentum acceleration trick. Unlike most existing accelerated stochastic variance reduction methods such as Katyusha, ASVRG requires only one additional variable and one momentum parameter. Thus, ASVRG is much simpler than those methods and has much lower per-iteration complexity. We prove that ASVRG achieves the best-known oracle complexities for both strongly convex and non-strongly convex objectives. In addition, we extend ASVRG to mini-batch and non-smooth settings. We also empirically verify our theoretical results and show that the performance of ASVRG is comparable to, and sometimes even better than, that of state-of-the-art stochastic methods.
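The abstract describes the method only at a high level, so the following is a minimal Python sketch of a momentum-accelerated proximal SVRG loop in that spirit. It is not the authors' exact update rules from the paper: the coupling step x = x_snap + omega * (y - x_snap), the step size eta, the epoch length m, and the averaged snapshot are all assumptions made for illustration, and asvrg_sketch together with the demo problem are hypothetical names and data.

import numpy as np

def asvrg_sketch(grad_i, full_grad, prox, x0, n, eta=0.002, omega=0.5,
                 epochs=30, m=None):
    """Sketch of a momentum-accelerated proximal SVRG epoch loop.

    grad_i(x, i)  -- gradient of the i-th component function at x
    full_grad(x)  -- full gradient (1/n) * sum_i grad_i(x, i)
    prox(v, t)    -- proximal operator of the regularizer with step t
    """
    m = m if m is not None else n    # inner-loop length (one pass by default)
    x_snap = np.asarray(x0, dtype=float).copy()  # snapshot point (the paper's x-tilde)
    y = x_snap.copy()                # the single additional variable
    for _ in range(epochs):
        mu = full_grad(x_snap)       # full gradient at the snapshot
        x = x_snap.copy()
        x_sum = np.zeros_like(x_snap)
        for _ in range(m):
            i = np.random.randint(n)
            # variance-reduced stochastic gradient at the coupled point x
            g_vr = grad_i(x, i) - grad_i(x_snap, i) + mu
            # proximal update of the auxiliary variable y
            y = prox(y - (eta / omega) * g_vr, eta / omega)
            # momentum coupling: one extra variable y, one parameter omega
            x = x_snap + omega * (y - x_snap)
            x_sum += x
        x_snap = x_sum / m           # new snapshot (epoch average)
    return x_snap

if __name__ == "__main__":
    # Hypothetical demo: l2-regularized least squares,
    # (1/2n) * ||Ax - b||^2 + (lam/2) * ||x||^2
    rng = np.random.default_rng(0)
    n, d, lam = 200, 10, 1e-2
    A, b = rng.standard_normal((n, d)), rng.standard_normal(n)
    gi = lambda x, i: (A[i] @ x - b[i]) * A[i]    # per-sample gradient
    fg = lambda x: A.T @ (A @ x - b) / n          # full gradient
    px = lambda v, t: v / (1.0 + t * lam)         # prox of (lam/2)*||x||^2
    x_hat = asvrg_sketch(gi, fg, px, np.zeros(d), n)
    print("objective:", 0.5 * np.mean((A @ x_hat - b) ** 2)
          + 0.5 * lam * x_hat @ x_hat)

Each inner iteration maintains only the one extra vector y and the one scalar omega, which is the simplicity the abstract contrasts with methods such as Katyusha; the per-iteration cost is two component-gradient evaluations plus O(d) vector work.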
Cite
Text
Shang et al. "ASVRG: Accelerated Proximal SVRG." Proceedings of The 10th Asian Conference on Machine Learning, 2018.
Markdown
[Shang et al. "ASVRG: Accelerated Proximal SVRG." Proceedings of The 10th Asian Conference on Machine Learning, 2018.](https://mlanthology.org/acml/2018/shang2018acml-asvrg/)
BibTeX
@inproceedings{shang2018acml-asvrg,
title = {{ASVRG: Accelerated Proximal SVRG}},
author = {Shang, Fanhua and Jiao, Licheng and Zhou, Kaiwen and Cheng, James and Ren, Yan and Jin, Yufei},
booktitle = {Proceedings of The 10th Asian Conference on Machine Learning},
year = {2018},
pages = {815--830},
volume = {95},
url = {https://mlanthology.org/acml/2018/shang2018acml-asvrg/}
}