A Unified Analysis of Stochastic Momentum Methods for Deep Learning
Abstract
Stochastic momentum methods have been widely adopted for training deep neural networks. However, theoretical analysis of their convergence on the training objective and of their generalization error for prediction remains under-explored. This paper aims to bridge the gap between practice and theory by analyzing the stochastic gradient (SG) method and stochastic momentum methods, including two famous variants, i.e., the stochastic heavy-ball (SHB) method and the stochastic variant of Nesterov's accelerated gradient (SNAG) method. We propose a framework that unifies the three variants. We then derive convergence rates of the gradient norm for the non-convex optimization problem, and analyze the generalization performance through the uniform stability approach. In particular, the convergence analysis of the training objective shows that SHB and SNAG have no advantage over SG. However, the stability analysis shows that the momentum term can improve the stability of the learned model and hence improve the generalization performance. These theoretical insights verify the common wisdom and are also corroborated by our empirical analysis on deep learning.
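To make the three methods discussed in the abstract concrete, the sketch below implements the standard textbook forms of SG, SHB, and SNAG against a generic stochastic gradient oracle. This is a minimal illustration under stated assumptions: the function name run_momentum_method, the variant switch, and the noisy quadratic example are hypothetical choices for exposition, and the updates shown are the common formulations of each method rather than the paper's exact unified parameterization (in which all three arise as special cases; here, setting beta=0 likewise recovers plain SG).

```python
import numpy as np

def run_momentum_method(grad_fn, x0, lr=0.01, beta=0.9, steps=100, variant="shb"):
    """Toy training loop for SG, SHB, and SNAG with a stochastic gradient oracle.

    grad_fn(x) returns a (possibly noisy) gradient estimate at x.
    Standard textbook updates (illustrative, not the paper's unified scheme):
      SG   : x_{t+1} = x_t - lr * g(x_t)
      SHB  : x_{t+1} = x_t - lr * g(x_t) + beta * (x_t - x_{t-1})
      SNAG : y_t = x_t + beta * (x_t - x_{t-1});  x_{t+1} = y_t - lr * g(y_t)
    """
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(steps):
        momentum = beta * (x - x_prev)  # beta = 0 reduces SHB/SNAG to plain SG
        if variant == "sg":
            x_next = x - lr * grad_fn(x)
        elif variant == "shb":
            x_next = x - lr * grad_fn(x) + momentum
        elif variant == "snag":
            y = x + momentum                 # look-ahead point
            x_next = y - lr * grad_fn(y)     # gradient taken at the look-ahead
        else:
            raise ValueError(f"unknown variant: {variant}")
        x_prev, x = x, x_next
    return x

if __name__ == "__main__":
    # Example: noisy quadratic objective f(x) = 0.5 * ||x||^2, so g(x) = x + noise.
    rng = np.random.default_rng(0)
    noisy_grad = lambda x: x + 0.1 * rng.standard_normal(x.shape)
    print(run_momentum_method(noisy_grad, np.ones(5), variant="snag"))
```

The only structural difference between the two momentum variants in this sketch is where the gradient is evaluated: SHB uses the current iterate, while SNAG uses the look-ahead point shifted by the momentum term, which is what a unified parameterization can interpolate between.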
BibTeX
@inproceedings{yan2018ijcai-unified,
title = {{A Unified Analysis of Stochastic Momentum Methods for Deep Learning}},
author = {Yan, Yan and Yang, Tianbao and Li, Zhe and Lin, Qihang and Yang, Yi},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2018},
  pages = {2955--2961},
doi = {10.24963/IJCAI.2018/410},
url = {https://mlanthology.org/ijcai/2018/yan2018ijcai-unified/}
}