Conditional Gradient Methods via Stochastic Path-Integrated Differential Estimator

Abstract

We propose a class of variance-reduced stochastic conditional gradient methods. By adopting the recent stochastic path-integrated differential estimator (SPIDER) technique of Fang et al. (2018) for the classical Frank-Wolfe (FW) method, we introduce SPIDER-FW for finite-sum minimization as well as the more general expectation minimization problem. SPIDER-FW enjoys superior complexity guarantees in the non-convex setting, while matching the best-known FW variants in the convex case. We also extend our framework à la the conditional gradient sliding (CGS) of Lan & Zhou (2016), and propose SPIDER-CGS.
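
To make the idea concrete, below is a minimal sketch of a SPIDER-style Frank-Wolfe loop: the gradient estimator is restarted with a large-batch gradient at periodic checkpoints and then updated by accumulating minibatch gradient differences along the iterate path, while each step only calls a linear minimization oracle. The toy least-squares problem, the ℓ1-ball constraint, and all parameters (epoch length q, batch size, step size) are illustrative assumptions, not the exact choices analyzed in the paper.

```python
import numpy as np

# Toy problem (assumed for illustration): least-squares loss
# f(x) = (1/2n) ||A x - b||^2 over the l1-ball {x : ||x||_1 <= r}.
rng = np.random.default_rng(0)
n, d, r = 500, 50, 5.0
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

def grad_batch(x, idx):
    """Minibatch gradient of the least-squares loss on rows `idx`."""
    Ai = A[idx]
    return Ai.T @ (Ai @ x - b[idx]) / len(idx)

def lmo_l1(g, r):
    """Linear minimization oracle over the l1-ball: argmin_{||s||_1 <= r} <g, s>."""
    i = np.argmax(np.abs(g))
    s = np.zeros_like(g)
    s[i] = -r * np.sign(g[i])
    return s

def spider_fw(T=200, q=20, batch=10):
    """SPIDER-style FW sketch; q, batch, and the step size are illustrative."""
    x = np.zeros(d)
    x_prev = x
    v = grad_batch(x, np.arange(n))
    for t in range(T):
        if t % q == 0:
            # Checkpoint: restart the estimator with a full (large-batch) gradient.
            v = grad_batch(x, np.arange(n))
        else:
            # SPIDER recursion: add the minibatch gradient difference between
            # consecutive iterates to the running estimator.
            idx = rng.choice(n, size=batch, replace=False)
            v = v + grad_batch(x, idx) - grad_batch(x_prev, idx)
        x_prev = x
        gamma = 2.0 / (t + 2)  # classical FW step-size schedule (one common choice)
        x = x + gamma * (lmo_l1(v, r) - x)
    return x

x_hat = spider_fw()
```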

Cite

Text

Yurtsever et al. "Conditional Gradient Methods via Stochastic Path-Integrated Differential Estimator." International Conference on Machine Learning, 2019.

Markdown

[Yurtsever et al. "Conditional Gradient Methods via Stochastic Path-Integrated Differential Estimator." International Conference on Machine Learning, 2019.](https://mlanthology.org/icml/2019/yurtsever2019icml-conditional/)

BibTeX

@inproceedings{yurtsever2019icml-conditional,
  title     = {{Conditional Gradient Methods via Stochastic Path-Integrated Differential Estimator}},
  author    = {Yurtsever, Alp and Sra, Suvrit and Cevher, Volkan},
  booktitle = {International Conference on Machine Learning},
  year      = {2019},
  pages     = {7282--7291},
  volume    = {97},
  url       = {https://mlanthology.org/icml/2019/yurtsever2019icml-conditional/}
}