Iteration Complexity of Feasible Descent Methods for Convex Optimization

Abstract

In many machine learning problems, such as the dual form of the support vector machine (SVM), the objective function to be minimized is convex but not strongly convex. This makes it difficult to establish the iteration complexity of some commonly used optimization algorithms. In this paper, we prove global linear convergence for a wide range of algorithms when they are applied to certain non-strongly convex problems. In particular, we are the first to prove an $O(\log(1/\epsilon))$ time complexity for cyclic coordinate descent methods on the dual problems of support vector classification and regression.
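
The dual problem referred to in the abstract is a box-constrained quadratic program: for L1-loss support vector classification with a linear kernel and no bias term, it takes the form $\min_{0 \le \alpha_i \le C} \frac{1}{2}\alpha^{\top} Q \alpha - e^{\top}\alpha$ with $Q_{ij} = y_i y_j x_i^{\top} x_j$. The sketch below is a minimal illustration of cyclic coordinate descent on this dual under those assumptions; it is not the authors' implementation, and the function name, data layout, and stopping rule are illustrative choices.

```python
# Minimal sketch (assumed setup, not the paper's code): cyclic coordinate
# descent on the dual of a linear L1-loss SVM without a bias term,
#     min_{0 <= alpha_i <= C}  1/2 alpha^T Q alpha - e^T alpha,
# where Q_ij = y_i y_j x_i^T x_j and y_i in {+1, -1}.
import numpy as np

def dual_cd_svm(X, y, C=1.0, max_epochs=100, tol=1e-6):
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)                      # w = sum_i alpha_i y_i x_i, maintained incrementally
    Qii = np.einsum("ij,ij->i", X, X)    # diagonal of Q (since y_i^2 = 1)
    for epoch in range(max_epochs):
        max_change = 0.0
        for i in range(n):               # one cyclic sweep over the coordinates
            if Qii[i] == 0.0:
                continue
            grad = y[i] * w.dot(X[i]) - 1.0                          # partial derivative w.r.t. alpha_i
            new_alpha = min(max(alpha[i] - grad / Qii[i], 0.0), C)   # exact 1-D minimization + projection onto [0, C]
            delta = new_alpha - alpha[i]
            if delta != 0.0:
                w += delta * y[i] * X[i]                             # keep w consistent with alpha
                alpha[i] = new_alpha
                max_change = max(max_change, abs(delta))
        if max_change < tol:             # linear convergence suggests few sweeps for moderate tolerances
            break
    return w, alpha

# Example usage on a tiny separable problem:
if __name__ == "__main__":
    X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
    y = np.array([1.0, 1.0, -1.0, -1.0])
    w, alpha = dual_cd_svm(X, y, C=1.0)
    print("w =", w, "predictions:", np.sign(X.dot(w)))
```

Each coordinate update is a feasible descent step (exact minimization along one coordinate followed by projection onto the box), which is the class of methods whose iteration complexity the paper analyzes.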

Cite

Text

Wang and Lin. "Iteration Complexity of Feasible Descent Methods for Convex Optimization." Journal of Machine Learning Research, 2014.

Markdown

[Wang and Lin. "Iteration Complexity of Feasible Descent Methods for Convex Optimization." Journal of Machine Learning Research, 2014.](https://mlanthology.org/jmlr/2014/wang2014jmlr-iteration/)

BibTeX

@article{wang2014jmlr-iteration,
  title     = {{Iteration Complexity of Feasible Descent Methods for Convex Optimization}},
  author    = {Wang, Po-Wei and Lin, Chih-Jen},
  journal   = {Journal of Machine Learning Research},
  year      = {2014},
  pages     = {1523--1548},
  volume    = {15},
  url       = {https://mlanthology.org/jmlr/2014/wang2014jmlr-iteration/}
}