REST-Katyusha: Exploiting the Solution's Structure via Scheduled Restart Schemes

Abstract

We propose a structure-adaptive variant of Katyusha, a state-of-the-art stochastic variance-reduced gradient algorithm, for regularized empirical risk minimization. The proposed method is able to exploit the intrinsic low-dimensional structure of the solution, such as sparsity or low rank enforced by a non-smooth regularizer, to achieve an even faster convergence rate. This provable algorithmic improvement is obtained by restarting the Katyusha algorithm according to restricted strong-convexity constants. We demonstrate the effectiveness of our approach via numerical experiments.
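To illustrate the general idea of a scheduled restart scheme, here is a minimal, hypothetical sketch: an inner solver is run for a fixed number of iterations determined by an estimated restricted strong-convexity constant, then restarted from its own output. The inner solver below is plain proximal gradient descent on a toy 1-D lasso-type problem, standing in for Katyusha; the names (`prox_l1`, `inner_stage`, `scheduled_restart`, `mu_c`) and the restart period formula are illustrative assumptions, not the paper's algorithm or code.

```python
# Schematic scheduled-restart wrapper (illustrative only; NOT the authors' code).
# The inner solver here is proximal gradient descent; in the paper it is Katyusha.

def prox_l1(x, step, lam):
    """Soft-thresholding: proximal operator of lam * |x| with step size `step`."""
    t = step * lam
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

def inner_stage(x, grad, step, lam, iters):
    """Stand-in inner solver: `iters` proximal-gradient steps from warm start x."""
    for _ in range(iters):
        x = prox_l1(x - step * grad(x), step, lam)
    return x

def scheduled_restart(x0, grad, step, lam, mu_c, stages):
    """Restart the inner solver on a schedule set by an (assumed known)
    restricted strong-convexity estimate mu_c; the period formula is schematic."""
    period = max(1, int(1.0 / mu_c ** 0.5))
    x = x0
    for _ in range(stages):
        x = inner_stage(x, grad, step, lam, period)  # warm-start each stage
    return x

# Toy problem: minimize 0.5*(x - 3)^2 + 0.5*|x|, whose minimizer is x* = 2.5.
grad = lambda x: x - 3.0
x_hat = scheduled_restart(0.0, grad, step=0.5, lam=0.5, mu_c=0.25, stages=20)
```

In practice the restricted strong-convexity constant is not known in advance; the paper's contribution includes setting the restart schedule from such constants, whereas this sketch simply takes `mu_c` as an input.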

Cite

Text

Tang et al. "REST-Katyusha: Exploiting the Solution's Structure via Scheduled Restart Schemes." Neural Information Processing Systems, 2018.

Markdown

[Tang et al. "REST-Katyusha: Exploiting the Solution's Structure via Scheduled Restart Schemes." Neural Information Processing Systems, 2018.](https://mlanthology.org/neurips/2018/tang2018neurips-restkatyusha/)

BibTeX

@inproceedings{tang2018neurips-restkatyusha,
  title     = {{REST-Katyusha: Exploiting the Solution's Structure via Scheduled Restart Schemes}},
  author    = {Tang, Junqi and Golbabaee, Mohammad and Bach, Francis and Davies, Mike E},
  booktitle = {Neural Information Processing Systems},
  year      = {2018},
  pages     = {429--440},
  url       = {https://mlanthology.org/neurips/2018/tang2018neurips-restkatyusha/}
}