Smooth Primal-Dual Coordinate Descent Algorithms for Nonsmooth Convex Optimization

Abstract

We propose a new randomized coordinate descent method for a convex optimization template with broad applications. Our analysis relies on a novel combination of four ideas applied to the primal-dual gap function: smoothing, acceleration, homotopy, and coordinate descent with non-uniform sampling. As a result, our method is the first coordinate descent method to offer convergence rate guarantees that are the best known under a variety of common structural assumptions on the template. We provide numerical evidence supporting the theoretical results, with comparisons to state-of-the-art algorithms.
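One of the ingredients named in the abstract, coordinate descent with non-uniform sampling, can be illustrated with a minimal sketch. This is not the paper's primal-dual method; it is a generic coordinate descent loop on a smooth least-squares objective, where each coordinate is sampled with probability proportional to its coordinate-wise Lipschitz constant (a common non-uniform scheme). All names and the problem instance here are illustrative assumptions.

```python
import numpy as np

# Generic sketch: randomized coordinate descent with non-uniform sampling
# on f(x) = 0.5 * ||A x - b||^2. Illustrative only; not the paper's
# smooth primal-dual algorithm.

rng = np.random.default_rng(0)

n, d = 50, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

# Coordinate-wise Lipschitz constants L_i = ||A[:, i]||^2.
# Sampling coordinate i with probability proportional to L_i is a
# standard non-uniform sampling choice.
L = (A ** 2).sum(axis=0)
p = L / L.sum()

x = np.zeros(d)
for _ in range(5000):
    i = rng.choice(d, p=p)              # non-uniform coordinate pick
    grad_i = A[:, i] @ (A @ x - b)      # partial derivative along coordinate i
    x[i] -= grad_i / L[i]               # step size 1/L_i along that coordinate

# After enough iterations, x approaches the least-squares solution.
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
```

The non-uniform probabilities bias updates toward coordinates along which the gradient can change fastest, which typically improves the rate over uniform sampling when the coordinate Lipschitz constants differ widely.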

Cite

Text

Alacaoglu et al. "Smooth Primal-Dual Coordinate Descent Algorithms for Nonsmooth Convex Optimization." Neural Information Processing Systems, 2017.

Markdown

[Alacaoglu et al. "Smooth Primal-Dual Coordinate Descent Algorithms for Nonsmooth Convex Optimization." Neural Information Processing Systems, 2017.](https://mlanthology.org/neurips/2017/alacaoglu2017neurips-smooth/)

BibTeX

@inproceedings{alacaoglu2017neurips-smooth,
  title     = {{Smooth Primal-Dual Coordinate Descent Algorithms for Nonsmooth Convex Optimization}},
  author    = {Alacaoglu, Ahmet and Tran-Dinh, Quoc and Fercoq, Olivier and Cevher, Volkan},
  booktitle = {Neural Information Processing Systems},
  year      = {2017},
  pages     = {5852--5861},
  url       = {https://mlanthology.org/neurips/2017/alacaoglu2017neurips-smooth/}
}