An Adaptive Accelerated Proximal Gradient Method and Its Homotopy Continuation for Sparse Optimization

Abstract

We first propose an adaptive accelerated proximal gradient (APG) method for minimizing strongly convex composite functions with unknown convexity parameters. This method incorporates a restarting scheme to automatically estimate the strong convexity parameter and achieves a nearly optimal iteration complexity. Then we consider the ℓ1-regularized least-squares (ℓ1-LS) problem in the high-dimensional setting. Although such an objective function is not strongly convex, it has restricted strong convexity over sparse vectors. We exploit this property by combining the adaptive APG method with a homotopy continuation scheme, which generates a sparse solution path towards optimality. This method obtains a global linear rate of convergence, and its overall iteration complexity has a weaker dependence on the restricted condition number than in previous work.
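The abstract describes two ingredients: an accelerated proximal gradient loop with restarts, and a homotopy continuation that shrinks the regularization parameter geometrically with warm starts. Below is a minimal, hypothetical NumPy sketch of that combination for the ℓ1-LS problem. The restart rule shown (restart the momentum when the objective increases) is a common heuristic used here as a stand-in; the paper's adaptive scheme instead uses restarts to estimate the strong convexity parameter. All names (`apg_with_restart`, `homotopy_l1ls`, the shrink factor `eta`) are illustrative, not from the paper.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def l1ls_objective(A, b, lam, x):
    """l1-LS objective: 0.5 * ||A x - b||^2 + lam * ||x||_1."""
    return 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.linalg.norm(x, 1)

def apg_with_restart(A, b, lam, x0, L, max_iter=500, tol=1e-8):
    """Accelerated proximal gradient (FISTA-style) for l1-LS, with a
    heuristic momentum restart whenever the objective increases.
    NOTE: a simplified stand-in for the paper's adaptive APG, which
    restarts to estimate the strong convexity parameter."""
    x, y, theta = x0.copy(), x0.copy(), 1.0
    f_prev = np.inf
    for _ in range(max_iter):
        grad = A.T @ (A @ y - b)               # gradient of the smooth part at y
        x_new = soft_threshold(y - grad / L, lam / L)
        f_new = l1ls_objective(A, b, lam, x_new)
        if f_new > f_prev:                     # progress stalled: drop momentum
            y, theta, f_prev = x.copy(), 1.0, np.inf
            continue
        theta_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * theta ** 2))
        y = x_new + ((theta - 1.0) / theta_new) * (x_new - x)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x, theta, f_prev = x_new, theta_new, f_new
    return x

def homotopy_l1ls(A, b, lam_target, eta=0.7):
    """Homotopy continuation: start at the largest useful lambda (where
    x = 0 is optimal), shrink it geometrically by eta, and warm-start
    each stage from the previous solution so iterates stay sparse."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant ||A||_2^2
    lam = np.max(np.abs(A.T @ b))              # x = 0 is optimal at this lambda
    x = np.zeros(A.shape[1])
    while lam > lam_target:
        lam = max(eta * lam, lam_target)
        x = apg_with_restart(A, b, lam, x, L)
    return x

# Tiny usage example: recover a sparse vector from noisy measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 400))
x_true = np.zeros(400)
x_true[rng.choice(400, 10, replace=False)] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = homotopy_l1ls(A, b, lam_target=0.1)
print("nonzeros recovered:", np.count_nonzero(np.abs(x_hat) > 1e-3))
```

The warm starts are what connect the sketch to the abstract's argument: along the path of decreasing λ the intermediate solutions remain sparse, which is exactly the regime where the objective enjoys restricted strong convexity and the accelerated method can converge linearly.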

Cite

Text

Lin and Xiao. "An Adaptive Accelerated Proximal Gradient Method and Its Homotopy Continuation for Sparse Optimization." International Conference on Machine Learning, 2014.

Markdown

[Lin and Xiao. "An Adaptive Accelerated Proximal Gradient Method and Its Homotopy Continuation for Sparse Optimization." International Conference on Machine Learning, 2014.](https://mlanthology.org/icml/2014/lin2014icml-adaptive/)

BibTeX

@inproceedings{lin2014icml-adaptive,
  title     = {{An Adaptive Accelerated Proximal Gradient Method and Its Homotopy Continuation for Sparse Optimization}},
  author    = {Lin, Qihang and Xiao, Lin},
  booktitle = {International Conference on Machine Learning},
  year      = {2014},
  pages     = {73--81},
  volume    = {32},
  url       = {https://mlanthology.org/icml/2014/lin2014icml-adaptive/}
}