Efficient Inexact Proximal Gradient Algorithm for Nonconvex Problems
Abstract
While the proximal gradient algorithm was originally designed for convex optimization, several variants have recently been proposed for nonconvex problems. Among them, nmAPG [Li and Lin, 2015] is the state of the art. However, it is inefficient when the proximal step has no closed-form solution, or when such a solution exists but is expensive to compute, as it requires more than one proximal step to be solved exactly in each iteration. In this paper, we propose an efficient inexact accelerated proximal gradient (niAPG) algorithm for nonconvex problems. In each iteration, it requires only one inexact (less expensive) proximal step. Convergence to a critical point is still guaranteed, and an O(1/k) convergence rate is derived. Experiments on image inpainting and matrix completion problems demonstrate that the proposed algorithm has performance comparable to the state of the art, but is much faster.
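To illustrate the idea of an inexact proximal step, here is a minimal, hedged sketch (not the authors' niAPG, whose acceleration and monotonicity safeguards are not reproduced here): the proximal subproblem is solved only approximately by a few inner gradient iterations instead of exactly. All function names and the toy objective below are illustrative assumptions.

```python
import numpy as np

def inexact_prox(g_grad, v, eta, inner_iters=5, inner_lr=0.05):
    """Approximately solve argmin_u g(u) + ||u - v||^2 / (2*eta)
    by a few inner gradient steps (the 'inexact' proximal step)."""
    u = v.copy()
    for _ in range(inner_iters):
        u -= inner_lr * (g_grad(u) + (u - v) / eta)
    return u

def inexact_pg(f_grad, g_grad, x0, eta=0.1, iters=100):
    """Generic inexact proximal gradient loop for F(x) = f(x) + g(x):
    one gradient step on f, then one inexact proximal step for g."""
    x = x0.copy()
    for _ in range(iters):
        v = x - eta * f_grad(x)           # forward (gradient) step on f
        x = inexact_prox(g_grad, v, eta)  # single inexact proximal step
    return x

# Toy example (smooth g, chosen only so the inner loop is plain
# gradient descent): f(x) = 0.5*||x - a||^2, g(x) = 0.5*mu*||x||^2.
a = np.array([2.0, -1.0])
mu = 1.0
x_star = inexact_pg(lambda x: x - a, lambda x: mu * x, np.zeros(2))
# The exact minimizer of F is a / (1 + mu); the inexact iterates
# land close to it despite only 5 inner steps per proximal call.
```

The point of the sketch is the trade-off the abstract describes: when the proximal step has no cheap closed form, solving it approximately (here, a fixed small number of inner iterations) keeps each outer iteration inexpensive while the outer loop still converges to the right neighborhood.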
Cite
Text
Yao et al. "Efficient Inexact Proximal Gradient Algorithm for Nonconvex Problems." International Joint Conference on Artificial Intelligence, 2017. doi:10.24963/IJCAI.2017/462

Markdown

[Yao et al. "Efficient Inexact Proximal Gradient Algorithm for Nonconvex Problems." International Joint Conference on Artificial Intelligence, 2017.](https://mlanthology.org/ijcai/2017/yao2017ijcai-efficient/) doi:10.24963/IJCAI.2017/462

BibTeX
@inproceedings{yao2017ijcai-efficient,
title = {{Efficient Inexact Proximal Gradient Algorithm for Nonconvex Problems}},
author = {Yao, Quanming and Kwok, James T. and Gao, Fei and Chen, Wei and Liu, Tie-Yan},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2017},
pages = {3308-3314},
doi = {10.24963/IJCAI.2017/462},
url = {https://mlanthology.org/ijcai/2017/yao2017ijcai-efficient/}
}