O(logT) Projections for Stochastic Optimization of Smooth and Strongly Convex Functions
Abstract
Traditional algorithms for stochastic optimization require projecting the solution onto a given domain at each iteration to ensure its feasibility. When the domain is complex, such as the positive semidefinite cone, the projection operation can be expensive, leading to a high computational cost per iteration. In this paper, we present a novel algorithm that reduces the number of projections needed for stochastic optimization. The proposed algorithm combines the strengths of several recent developments in stochastic optimization, including mini-batches, extra-gradient, and epoch gradient descent, to effectively exploit the smoothness and strong convexity of the objective. We show, both in expectation and with high probability, that when the objective function is both smooth and strongly convex, the proposed algorithm achieves the optimal O(1/T) rate of convergence with only O(logT) projections. Our empirical study verifies the theoretical result.
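The central idea, projecting only once per epoch so that T total iterations split into doubling epochs incur just O(logT) projections, can be illustrated with a short sketch. The Python code below is not the paper's algorithm (which also incorporates mini-batches and extra-gradient steps); it is a minimal illustration assuming a hypothetical stochastic gradient oracle `grad_oracle` and an expensive projection operator `project`.

```python
import numpy as np

def epoch_sgd_sketch(grad_oracle, project, x0, T, eta0=1.0):
    """Hedged sketch: epoch-based SGD with one projection per epoch.

    Not the authors' exact method; it only illustrates how T iterations,
    split into doubling epochs, require O(log T) projection calls.
    grad_oracle(x): returns a stochastic gradient at x (assumed interface).
    project(x): the (expensive) projection onto the feasible domain.
    """
    x = project(x0)                  # start from a feasible point
    epoch_len, eta, used = 1, eta0, 0
    while used < T:
        steps = min(epoch_len, T - used)
        for _ in range(steps):       # unconstrained steps within the epoch
            x = x - eta * grad_oracle(x)
        x = project(x)               # single projection per epoch
        used += steps
        epoch_len *= 2               # doubling epochs -> O(log T) epochs
        eta /= 2                     # shrink step size each epoch (illustrative)
    return x

# Usage example (hypothetical problem): stochastic least squares,
# feasible set = unit Euclidean ball.
rng = np.random.default_rng(0)
A, b = rng.normal(size=(1000, 5)), rng.normal(size=1000)

def grad_oracle(x):                  # gradient on a random mini-batch
    i = rng.integers(0, 1000, size=32)
    return A[i].T @ (A[i] @ x - b[i]) / 32

def project(x):                      # projection onto the unit ball
    n = np.linalg.norm(x)
    return x if n <= 1 else x / n

x_hat = epoch_sgd_sketch(grad_oracle, project, np.zeros(5), T=10_000, eta0=0.1)
```

With 10,000 iterations, the sketch above makes only about 14 projection calls (one per doubling epoch), which is the source of the O(logT) count in the title.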
Cite
Text
Zhang et al. "O(logT) Projections for Stochastic Optimization of Smooth and Strongly Convex Functions." International Conference on Machine Learning, 2013.
Markdown
[Zhang et al. "O(logT) Projections for Stochastic Optimization of Smooth and Strongly Convex Functions." International Conference on Machine Learning, 2013.](https://mlanthology.org/icml/2013/zhang2013icml-logt/)
BibTeX
@inproceedings{zhang2013icml-logt,
title = {{O(logT) Projections for Stochastic Optimization of Smooth and Strongly Convex Functions}},
author = {Zhang, Lijun and Yang, Tianbao and Jin, Rong and He, Xiaofei},
booktitle = {International Conference on Machine Learning},
year = {2013},
pages = {1121-1129},
volume = {28},
url = {https://mlanthology.org/icml/2013/zhang2013icml-logt/}
}