Asynchronous Stochastic Proximal Optimization Algorithms with Variance Reduction

Abstract

Regularized empirical risk minimization (R-ERM) is an important branch of machine learning, since it constrains the capacity of the hypothesis space and guarantees the generalization ability of the learning algorithm. Two classic proximal optimization algorithms, i.e., proximal stochastic gradient descent (ProxSGD) and proximal stochastic coordinate descent (ProxSCD), have been widely used to solve the R-ERM problem. Recently, the variance reduction technique was proposed to improve ProxSGD and ProxSCD, and the corresponding ProxSVRG and ProxSVRCD algorithms have better convergence rates. These proximal algorithms with variance reduction have also achieved great success in small- and moderate-scale applications. However, in order to solve large-scale R-ERM problems and make more practical impact, parallel versions of these algorithms are sorely needed. In this paper, we propose asynchronous ProxSVRG (Async-ProxSVRG) and asynchronous ProxSVRCD (Async-ProxSVRCD) algorithms, and prove that Async-ProxSVRG can achieve near-linear speedup when the training data is sparse, while Async-ProxSVRCD can achieve near-linear speedup regardless of the sparsity condition, as long as the number of block partitions is appropriately set. We have conducted experiments on a regularized logistic regression task. The results verified our theoretical findings and demonstrated the practical efficiency of the asynchronous stochastic proximal algorithms with variance reduction.
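As background for the abstract, the serial ProxSVRG building block that the paper parallelizes combines the SVRG variance-reduced gradient estimator with a proximal step. The sketch below is an illustrative, minimal serial version (not the paper's asynchronous algorithm) for a toy 1-D least-squares loss with an L1 regularizer; the step size, epoch count, and data are assumptions chosen for the example.

```python
import random

def prox_l1(x, t):
    """Proximal operator of t * |.| (soft-thresholding)."""
    return max(abs(x) - t, 0.0) * (1.0 if x >= 0 else -1.0)

def grad_fi(w, a, b):
    """Gradient of the single-example loss f_i(w) = 0.5 * (a*w - b)^2."""
    return a * (a * w - b)

def prox_svrg(A, B, lam=0.01, eta=0.05, epochs=20, m=None, seed=0):
    """Serial ProxSVRG sketch: outer snapshots with full gradients,
    inner loop with the variance-reduced estimator and a prox step."""
    rng = random.Random(seed)
    n = len(A)
    m = m or 2 * n          # inner-loop length (a common heuristic choice)
    w = 0.0
    for _ in range(epochs):
        w_snap = w
        # Full gradient at the snapshot point.
        mu = sum(grad_fi(w_snap, a, b) for a, b in zip(A, B)) / n
        for _ in range(m):
            i = rng.randrange(n)
            # Variance-reduced stochastic gradient.
            v = grad_fi(w, A[i], B[i]) - grad_fi(w_snap, A[i], B[i]) + mu
            # Proximal step handles the non-smooth L1 regularizer.
            w = prox_l1(w - eta * v, eta * lam)
    return w
```

For instance, with all `a_i = b_i = 1` the problem reduces to minimizing `0.5*(w-1)^2 + lam*|w|`, whose minimizer is `1 - lam`, and the sketch converges to it. The asynchronous versions in the paper let multiple workers run the inner loop concurrently with possibly stale reads of `w`.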

Cite

Text

Meng et al. "Asynchronous Stochastic Proximal Optimization Algorithms with Variance Reduction." AAAI Conference on Artificial Intelligence, 2017. doi:10.1609/AAAI.V31I1.10910

Markdown

[Meng et al. "Asynchronous Stochastic Proximal Optimization Algorithms with Variance Reduction." AAAI Conference on Artificial Intelligence, 2017.](https://mlanthology.org/aaai/2017/meng2017aaai-asynchronous/) doi:10.1609/AAAI.V31I1.10910

BibTeX

@inproceedings{meng2017aaai-asynchronous,
  title     = {{Asynchronous Stochastic Proximal Optimization Algorithms with Variance Reduction}},
  author    = {Meng, Qi and Chen, Wei and Yu, Jingcheng and Wang, Taifeng and Ma, Zhiming and Liu, Tie-Yan},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2017},
  pages     = {2329--2335},
  doi       = {10.1609/AAAI.V31I1.10910},
  url       = {https://mlanthology.org/aaai/2017/meng2017aaai-asynchronous/}
}