SDCA Without Duality, Regularization, and Individual Convexity

Abstract

Stochastic Dual Coordinate Ascent (SDCA) is a popular method for solving regularized loss minimization with convex losses. We describe variants of SDCA that do not require explicit regularization and do not rely on duality. We prove linear convergence rates even if individual loss functions are non-convex, as long as the expected loss is strongly convex.
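
For a concrete picture, below is a minimal sketch (in Python/NumPy) of the dual-free SDCA update the paper describes for the ridge-regularized objective P(w) = (1/n) * sum_i phi_i(w) + (lam/2) * ||w||^2: each example i keeps a pseudo-dual vector alpha_i, the iterate maintains the invariant w = (1/(lam*n)) * sum_i alpha_i, and no dual objective is ever formed. The squared losses, data, and step size here are illustrative assumptions, not taken from the paper.

# A sketch of the dual-free SDCA update for the ridge-regularized objective
#   P(w) = (1/n) * sum_i phi_i(w) + (lam/2) * ||w||^2.
# The squared losses, data, and step size below are assumed for illustration.
import numpy as np

def dual_free_sdca(grad_phi, n, d, lam, eta, iters, rng):
    """grad_phi(i, w) returns the gradient of the i-th loss phi_i at w."""
    alpha = np.zeros((n, d))            # one pseudo-dual vector per example
    w = alpha.sum(axis=0) / (lam * n)   # invariant: w = (1/(lam*n)) * sum_i alpha_i
    for _ in range(iters):
        i = rng.integers(n)             # sample an example uniformly at random
        v = grad_phi(i, w) + alpha[i]   # update direction; no dual problem needed
        alpha[i] -= eta * lam * n * v   # pseudo-dual step
        w -= eta * v                    # preserves the invariant above
    return w

# Usage with assumed squared losses phi_i(w) = 0.5 * (x_i @ w - y_i)^2.
rng = np.random.default_rng(0)
n, d, lam = 200, 5, 0.1
X = rng.standard_normal((n, d))
y = X @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)
grad = lambda i, w: (X[i] @ w - y[i]) * X[i]
w_hat = dual_free_sdca(grad, n, d, lam, eta=0.005, iters=50 * n, rng=rng)
w_ridge = np.linalg.solve(X.T @ X / n + lam * np.eye(d), X.T @ y / n)
print(np.linalg.norm(w_hat - w_ridge))  # should shrink as iters grows

Note that the update only ever queries gradients of the individual losses, which is what lets the analysis drop individual convexity: each phi_i may be non-convex as long as the average loss is strongly convex.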

Cite

Text

Shalev-Shwartz. "SDCA Without Duality, Regularization, and Individual Convexity." International Conference on Machine Learning, 2016.

Markdown

[Shalev-Shwartz. "SDCA Without Duality, Regularization, and Individual Convexity." International Conference on Machine Learning, 2016.](https://mlanthology.org/icml/2016/shalevshwartz2016icml-sdca/)

BibTeX

@inproceedings{shalevshwartz2016icml-sdca,
  title     = {{SDCA Without Duality, Regularization, and Individual Convexity}},
  author    = {Shalev-Shwartz, Shai},
  booktitle = {International Conference on Machine Learning},
  year      = {2016},
  pages     = {747--754},
  volume    = {48},
  url       = {https://mlanthology.org/icml/2016/shalevshwartz2016icml-sdca/}
}