Katalyst: Boosting Convex Katayusha for Non-Convex Problems with a Large Condition Number

Abstract

An important class of non-convex objectives with wide applications in machine learning consists of a sum of $n$ smooth functions and a non-smooth convex function. Numerous studies have been devoted to solving these problems by leveraging one of two types of variance reduction techniques, i.e., the SVRG type, which computes a full gradient occasionally, and the SAGA type, which maintains $n$ stochastic gradients at every iteration. In practice, the SVRG type is often preferred to the SAGA type due to its potentially lower memory cost. An interesting question that has been largely ignored is how to improve the complexity of variance reduction methods for problems with a large condition number, which measures how close the objective is to a convex function. In this paper, we present a simple but non-trivial boosting of a state-of-the-art SVRG-type method for convex problems (namely Katyusha) that enjoys an improved complexity for solving non-convex problems with a large condition number (i.e., that are close to convex). To the best of our knowledge, its complexity has the best dependence on $n$ and the degree of non-convexity, and also matches that of a recent SAGA-type accelerated stochastic algorithm for constrained non-convex smooth optimization.
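For concreteness, the composite objective described in the abstract can be written as

$$\min_{x \in \mathbb{R}^d} \; F(x) := \frac{1}{n}\sum_{i=1}^{n} f_i(x) + g(x),$$

where each $f_i$ is $L$-smooth (possibly non-convex) and $g$ is convex but possibly non-smooth. If $F$ is $\mu$-weakly convex, i.e., $F(x) + \frac{\mu}{2}\|x\|^2$ is convex, then the condition number $L/\mu$ is large precisely when the objective is close to a convex function. The abstract does not spell out the boosting mechanism; a minimal sketch of one standard way to reuse a convex solver such as Katyusha in this regime, given only as an illustration and not as the paper's exact algorithm, is the Catalyst-style proximal-point construction, which repeatedly and inexactly solves the convexified subproblem

$$x_{k+1} \approx \arg\min_{x} \; F(x) + \frac{\gamma}{2}\|x - x_k\|^2, \qquad \gamma > \mu,$$

with the convex inner solver. The symbols $\mu$, $\gamma$, and $x_k$ are notation introduced here for the sketch rather than taken from the paper.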

Cite

Text

Chen et al. "Katalyst: Boosting Convex Katayusha for Non-Convex Problems with a Large Condition Number." International Conference on Machine Learning, 2019.

Markdown

[Chen et al. "Katalyst: Boosting Convex Katayusha for Non-Convex Problems with a Large Condition Number." International Conference on Machine Learning, 2019.](https://mlanthology.org/icml/2019/chen2019icml-katalyst/)

BibTeX

@inproceedings{chen2019icml-katalyst,
  title     = {{Katalyst: Boosting Convex Katayusha for Non-Convex Problems with a Large Condition Number}},
  author    = {Chen, Zaiyi and Xu, Yi and Hu, Haoyuan and Yang, Tianbao},
  booktitle = {International Conference on Machine Learning},
  year      = {2019},
  pages     = {1102--1111},
  volume    = {97},
  url       = {https://mlanthology.org/icml/2019/chen2019icml-katalyst/}
}