Efficient Projection-Free Online Methods with Stochastic Recursive Gradient
Abstract
This paper focuses on projection-free methods for solving smooth Online Convex Optimization (OCO) problems. Existing projection-free methods either achieve suboptimal regret bounds or have high per-round computational costs. To fill this gap, two efficient projection-free online methods called ORGFW and MORGFW are proposed for solving stochastic and adversarial OCO problems, respectively. By employing a recursive gradient estimator, our methods achieve optimal regret bounds (up to a logarithmic factor) while possessing low per-round computational costs. Experimental results demonstrate the efficiency of the proposed methods compared with state-of-the-art methods.
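The abstract's key ingredients are a recursive (SARAH/SPIDER-style) gradient estimator combined with a projection-free Frank-Wolfe update, which replaces costly projections with a linear minimization oracle (LMO). The sketch below illustrates this combination on a toy stream of quadratic losses over the probability simplex; the loss family, the simplex constraint, and the fixed parameters `rho` and `eta` are illustrative assumptions, not the exact algorithm or step-size schedule from the paper.

```python
import numpy as np

def lmo_simplex(g):
    # Linear minimization oracle over the probability simplex:
    # argmin_{v in simplex} <g, v> is the vertex at the smallest
    # coordinate of g, so no projection is ever needed.
    v = np.zeros_like(g)
    v[np.argmin(g)] = 1.0
    return v

def orgfw_sketch(ys, rho=0.5, eta=0.2):
    # Illustrative online Frank-Wolfe loop with a recursive gradient
    # estimator, on per-round losses f_t(x) = 0.5 * ||x - y_t||^2
    # constrained to the simplex (hypothetical setup for demonstration).
    dim = ys.shape[1]
    x = np.full(dim, 1.0 / dim)   # start at the simplex center
    x_prev = x.copy()
    d = np.zeros(dim)             # recursive gradient estimate
    for t, y in enumerate(ys):
        grad = x - y              # gradient of f_t at the current iterate
        grad_prev = x_prev - y    # same loss evaluated at the previous iterate
        if t == 0:
            d = grad
        else:
            # Recursive update: reuse the previous estimate, corrected by
            # the gradient difference between consecutive iterates.
            d = grad + (1.0 - rho) * (d - grad_prev)
        v = lmo_simplex(d)        # projection-free direction via the LMO
        x_prev = x.copy()
        x = x + eta * (v - x)     # convex combination: stays feasible
    return x

x = orgfw_sketch(np.random.default_rng(0).random((50, 5)))
```

Because each iterate is a convex combination of feasible points, the method never leaves the simplex, and the per-round cost is a single LMO call rather than a projection.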
Cite
Text
Xie et al. "Efficient Projection-Free Online Methods with Stochastic Recursive Gradient." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/AAAI.V34I04.6116
Markdown
[Xie et al. "Efficient Projection-Free Online Methods with Stochastic Recursive Gradient." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/xie2020aaai-efficient/) doi:10.1609/AAAI.V34I04.6116
BibTeX
@inproceedings{xie2020aaai-efficient,
title = {{Efficient Projection-Free Online Methods with Stochastic Recursive Gradient}},
author = {Xie, Jiahao and Shen, Zebang and Zhang, Chao and Wang, Boyu and Qian, Hui},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2020},
pages = {6446-6453},
doi = {10.1609/AAAI.V34I04.6116},
url = {https://mlanthology.org/aaai/2020/xie2020aaai-efficient/}
}