Towards Memory-Friendly Deterministic Incremental Gradient Method
Abstract
Incremental Gradient (IG) methods are classical strategies for solving finite-sum minimization problems. Deterministic IG methods are particularly favorable for handling massive-scale problems due to their memory-friendly data access pattern. In this paper, we propose a new deterministic IG variant of SVRG that blends a periodically updated full gradient with a component function gradient selected in cyclic order. Our method uses only $O(1)$ extra gradient storage without compromising linear convergence. Empirical results demonstrate that the proposed method is advantageous over existing incremental gradient algorithms, especially on problems that do not fit into physical memory.
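The abstract describes an SVRG-style scheme in which the full gradient is recomputed periodically at a snapshot point and the component gradients are then visited in cyclic rather than random order. The sketch below is a minimal illustration of that idea for a generic finite sum; the epoch structure, step size, and the least-squares example are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def cyclic_svrg_style(grad_fns, full_grad, x0, step_size=0.01, n_epochs=20):
    """Sketch of an SVRG-style update with cyclic component selection.

    grad_fns  : list of per-component gradient callables, grad_fns[i](x)
    full_grad : callable returning the full gradient (1/n) * sum_i grad_fns[i](x)
    The step size and epoch length are illustrative assumptions, not
    values from the paper.
    """
    n = len(grad_fns)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_epochs):
        snapshot = x.copy()
        g_full = full_grad(snapshot)       # periodically updated full gradient
        for i in range(n):                 # components visited in cyclic order
            # Variance-reduced direction; only O(1) extra gradient storage
            # (the snapshot and its full gradient) is kept between steps.
            v = grad_fns[i](x) - grad_fns[i](snapshot) + g_full
            x = x - step_size * v
    return x

# Hypothetical usage on a least-squares problem, f_i(x) = 0.5 * (a_i^T x - b_i)^2.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 5)), rng.standard_normal(50)
grads = [lambda x, a=A[i], bi=b[i]: a * (a @ x - bi) for i in range(50)]
full = lambda x: A.T @ (A @ x - b) / len(b)
x_hat = cyclic_svrg_style(grads, full, np.zeros(5))
```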
Cite
Text
Xie et al. "Towards Memory-Friendly Deterministic Incremental Gradient Method." International Conference on Artificial Intelligence and Statistics, 2018.
BibTeX
@inproceedings{xie2018aistats-memory,
title = {{Towards Memory-Friendly Deterministic Incremental Gradient Method}},
author = {Xie, Jiahao and Qian, Hui and Shen, Zebang and Zhang, Chao},
booktitle = {International Conference on Artificial Intelligence and Statistics},
year = {2018},
pages = {1147--1156},
url = {https://mlanthology.org/aistats/2018/xie2018aistats-memory/}
}