A Multi-Step Inertial Forward-Backward Splitting Method for Non-Convex Optimization

Abstract

In this paper, we propose a multi-step inertial Forward-Backward splitting algorithm for minimizing the sum of two not necessarily convex functions, one of which is proper lower semi-continuous while the other is differentiable with a Lipschitz continuous gradient. We first prove global convergence of the scheme with the help of the Kurdyka–Łojasiewicz property. Then, when the non-smooth part is also partly smooth relative to a smooth submanifold, we establish finite identification of the latter and provide a sharp local linear convergence analysis. The proposed method is illustrated on several problems arising from statistics and machine learning.
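To make the iteration concrete, below is a minimal sketch of a multi-step inertial Forward-Backward update: an extrapolation point is built from the last few iterate differences, a gradient (forward) step is taken on the smooth part, and a proximal (backward) step on the non-smooth part. The function names, the specific two-coefficient inertial weighting, and the ℓ1 prox used in the usage example are illustrative assumptions, not the authors' exact parameterization or guarantees.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal map of tau * ||.||_1 (used here only as an illustrative prox)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def multi_step_inertial_fb(grad_f, prox_g, x0, step, a, b, n_iter=500):
    """Sketch of a multi-step inertial Forward-Backward iteration.

    grad_f : gradient of the smooth part (assumed Lipschitz continuous)
    prox_g : proximal map of the non-smooth part, called as prox_g(v, step)
    x0     : starting point
    step   : step size gamma (typically below 1/L, L the Lipschitz constant)
    a, b   : inertial weights, one per history step (len(a) = 1 gives the
             usual single-step inertial Forward-Backward scheme)
    """
    s = len(a)
    history = [x0.copy() for _ in range(s + 1)]   # x_k, x_{k-1}, ..., x_{k-s}
    for _ in range(n_iter):
        xk = history[0]
        # Inertial extrapolation points built from the last s iterate differences.
        ya = xk + sum(a[i] * (history[i] - history[i + 1]) for i in range(s))
        yb = xk + sum(b[i] * (history[i] - history[i + 1]) for i in range(s))
        # Forward (gradient) step at yb, backward (proximal) step from ya.
        x_new = prox_g(ya - step * grad_f(yb), step)
        history = [x_new] + history[:-1]
    return history[0]
```

A hypothetical usage example on a Lasso-type problem, 0.5*||A x - y||^2 + lam*||x||_1, with hand-picked inertial weights:

```python
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
y = rng.standard_normal(50)
lam = 0.1
L = np.linalg.norm(A, 2) ** 2                     # Lipschitz constant of the gradient
grad = lambda x: A.T @ (A @ x - y)
prox = lambda v, t: soft_threshold(v, lam * t)
x_hat = multi_step_inertial_fb(grad, prox, np.zeros(100), 0.9 / L,
                               a=[0.3, 0.1], b=[0.3, 0.1])
```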

Cite

Text

Liang et al. "A Multi-Step Inertial Forward-Backward Splitting Method for Non-Convex Optimization." Neural Information Processing Systems, 2016.

Markdown

[Liang et al. "A Multi-Step Inertial Forward-Backward Splitting Method for Non-Convex Optimization." Neural Information Processing Systems, 2016.](https://mlanthology.org/neurips/2016/liang2016neurips-multistep/)

BibTeX

@inproceedings{liang2016neurips-multistep,
  title     = {{A Multi-Step Inertial Forward-Backward Splitting Method for Non-Convex Optimization}},
  author    = {Liang, Jingwei and Fadili, Jalal and Peyré, Gabriel},
  booktitle = {Neural Information Processing Systems},
  year      = {2016},
  pages     = {4035--4043},
  url       = {https://mlanthology.org/neurips/2016/liang2016neurips-multistep/}
}