A Nonconvex Proximal Splitting Algorithm Under Moreau-Yosida Regularization
Abstract
We tackle composite optimization problems whose objectives are (highly) nonconvex and nonsmooth. Classical nonconvex proximal splitting algorithms, such as nonconvex ADMM, suffer from a lack of convergence for such a problem class. In this work, we consider a Moreau-Yosida regularized variant of the original model and propose a novel multiblock primal-dual algorithm on the resulting lifted problem. We provide a complete convergence analysis of our algorithm, and identify respective optimality qualifications under which stationarity of the regularized problem is retrieved at convergence. Numerically, we demonstrate the relevance of our model and the efficiency of our algorithm on robust regression as well as joint variable selection and transductive learning.
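As a brief illustration of the Moreau-Yosida regularization the abstract refers to (not the authors' algorithm itself), the sketch below computes the proximal operator and Moreau envelope of the absolute value. For f(z) = |z|, the prox is closed-form soft-thresholding, and the resulting envelope is the smooth Huber function, a standard example of how the regularization smooths a nonsmooth term.

```python
import numpy as np

def prox_abs(x, mu):
    # Proximal operator of f(z) = |z|:
    #   argmin_z |z| + (1/(2*mu)) * (z - x)^2
    # Closed form: soft-thresholding with threshold mu.
    return np.sign(x) * np.maximum(np.abs(x) - mu, 0.0)

def moreau_envelope_abs(x, mu):
    # Moreau-Yosida regularization (Moreau envelope) of |.|:
    #   env(x) = min_z |z| + (1/(2*mu)) * (z - x)^2,
    # evaluated at the minimizer returned by prox_abs.
    # For the absolute value this equals the Huber function:
    # quadratic x^2/(2*mu) for |x| <= mu, linear |x| - mu/2 otherwise.
    z = prox_abs(x, mu)
    return np.abs(z) + (z - x) ** 2 / (2.0 * mu)
```

The envelope is C^1 with Lipschitz gradient even though |x| is nonsmooth at the origin, which is the mechanism the regularized model in the paper exploits.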
Cite
Text
Laude et al. "A Nonconvex Proximal Splitting Algorithm Under Moreau-Yosida Regularization." International Conference on Artificial Intelligence and Statistics, 2018.
Markdown
[Laude et al. "A Nonconvex Proximal Splitting Algorithm Under Moreau-Yosida Regularization." International Conference on Artificial Intelligence and Statistics, 2018.](https://mlanthology.org/aistats/2018/laude2018aistats-nonconvex/)
BibTeX
@inproceedings{laude2018aistats-nonconvex,
title = {{A Nonconvex Proximal Splitting Algorithm Under Moreau-Yosida Regularization}},
author = {Laude, Emanuel and Wu, Tao and Cremers, Daniel},
booktitle = {International Conference on Artificial Intelligence and Statistics},
year = {2018},
pages = {491--499},
url = {https://mlanthology.org/aistats/2018/laude2018aistats-nonconvex/}
}