Stochastic Primal-Dual Method for Empirical Risk Minimization with O(1) Per-Iteration Complexity
Abstract
Regularized empirical risk minimization problems with linear predictors appear frequently in machine learning. In this paper, we propose a new stochastic primal-dual method for this class of problems. Unlike existing methods, the proposed method requires only O(1) operations per iteration. We also develop a variance-reduced variant of the algorithm that converges linearly. Numerical experiments suggest that our methods are faster than existing ones, such as proximal SGD, SVRG, and SAGA, on high-dimensional problems.
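To make the O(1) per-iteration claim concrete, below is a minimal sketch of this style of primal-dual update, specialized to ridge regression: each iteration samples one data point i and one coordinate j, then reads and writes only the single matrix entry A[i, j], one dual coordinate, and one primal coordinate. The function name `spd1_sketch`, the squared-loss specialization, and the 1/sqrt(t) step-size schedule are illustrative assumptions, not the authors' exact SPD1 algorithm.

```python
import numpy as np

def spd1_sketch(A, b, lam=1e-2, eta=0.1, tau=0.1, num_iters=100_000, seed=0):
    """Sketch of an O(1)-per-iteration stochastic primal-dual method for
        min_x (1/n) * sum_i 0.5*(a_i^T x - b_i)^2 + (lam/2)*||x||^2
    via its saddle-point form
        min_x max_y (1/n) * sum_i (y_i * a_i^T x - conj_loss(y_i)) + (lam/2)*||x||^2.
    Each iteration touches a single entry A[i, j] of the data matrix."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)          # primal variable (the linear predictor)
    y = np.zeros(n)          # dual variables, one per sample
    for t in range(1, num_iters + 1):
        i = rng.integers(n)  # sample one data point
        j = rng.integers(d)  # sample one coordinate
        # Unbiased one-entry estimate of a_i^T x: E_j[d * A[i,j] * x[j]] = a_i^T x.
        z_hat = d * A[i, j] * x[j]
        # Dual ascent on y_i; for the squared loss, the conjugate is
        # conj_loss(y) = 0.5*y^2 + b_i*y, so the ascent direction is z - y_i - b_i.
        y[i] += tau / np.sqrt(t) * (z_hat - y[i] - b[i])
        # Primal descent on x_j; E_i[y_i * A[i,j]] = (1/n) * sum_i y_i * A[i,j],
        # and lam * x[j] is the j-th coordinate of the regularizer's gradient.
        x[j] -= eta / np.sqrt(t) * (y[i] * A[i, j] + lam * x[j])
    return x
```

Since each step costs O(1) arithmetic regardless of the sample size n or dimension d, the method can be attractive when both are large; per the abstract, the paper's variance-reduced variant additionally achieves linear convergence, which a plain sketch like this one does not.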
Cite
Text
Tan et al. "Stochastic Primal-Dual Method for Empirical Risk Minimization with O(1) Per-Iteration Complexity." Neural Information Processing Systems, 2018.

Markdown

[Tan et al. "Stochastic Primal-Dual Method for Empirical Risk Minimization with O(1) Per-Iteration Complexity." Neural Information Processing Systems, 2018.](https://mlanthology.org/neurips/2018/tan2018neurips-stochastic/)

BibTeX
@inproceedings{tan2018neurips-stochastic,
title = {{Stochastic Primal-Dual Method for Empirical Risk Minimization with O(1) Per-Iteration Complexity}},
author = {Tan, Conghui and Zhang, Tong and Ma, Shiqian and Liu, Ji},
booktitle = {Neural Information Processing Systems},
year = {2018},
pages = {8366--8375},
url = {https://mlanthology.org/neurips/2018/tan2018neurips-stochastic/}
}