Generalized Majorization-Minimization for Non-Convex Optimization
Abstract
Majorization-Minimization (MM) algorithms optimize an objective function by iteratively minimizing a majorizing surrogate and offer an attractively fast convergence rate for convex problems. However, their convergence behavior for non-convex problems remains unclear. In this paper, we relax the requirement on the MM surrogate function from strictly upper bounding the objective to bounding the objective in expectation. With this generalized surrogate, we develop a new optimization algorithm, termed SPI-MM, that leverages the recently proposed SPIDER for more efficient non-convex optimization. We prove that for finite-sum problems, the SPI-MM algorithm converges to a stationary point with deterministic and lower stochastic gradient complexity. To the best of our knowledge, this work gives the first non-asymptotic convergence analysis for MM-like algorithms in general non-convex optimization. Extensive empirical studies on non-convex logistic regression and sparse PCA demonstrate the efficiency of the proposed algorithm and validate our theoretical results.
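The classical MM scheme the abstract builds on can be illustrated with a minimal sketch: at each iterate, a quadratic surrogate that upper bounds the objective is minimized in closed form. The objective and smoothness bound L below are hypothetical choices for illustration, not from the paper; SPI-MM itself uses an expectation-level bound and the SPIDER gradient estimator, which this toy loop does not implement.

```python
import math

def f(x):
    # Toy non-convex objective: x^2 + 3*sin(x). Its second derivative,
    # 2 - 3*sin(x), is bounded above by L = 5, so the quadratic
    # surrogate used below majorizes f everywhere.
    return x * x + 3.0 * math.sin(x)

def grad_f(x):
    return 2.0 * x + 3.0 * math.cos(x)

def mm_minimize(x0, L=5.0, iters=100):
    """Classical MM loop: at iterate x_t, minimize the majorizer
    g(y) = f(x_t) + f'(x_t)(y - x_t) + (L/2)(y - x_t)^2,
    whose closed-form minimizer is y = x_t - f'(x_t)/L."""
    x = x0
    history = [f(x)]
    for _ in range(iters):
        x = x - grad_f(x) / L   # exact minimizer of the surrogate
        history.append(f(x))    # descent: f(x_{t+1}) <= g(x_{t+1}) <= f(x_t)
    return x, history
```

Because the surrogate upper bounds f and touches it at the current iterate, the objective values in `history` are monotonically non-increasing, which is the defining property that the paper's generalized surrogate retains only in expectation.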
Cite
Text
Zhang et al. "Generalized Majorization-Minimization for Non-Convex Optimization." International Joint Conference on Artificial Intelligence, 2019. doi:10.24963/IJCAI.2019/591
Markdown
[Zhang et al. "Generalized Majorization-Minimization for Non-Convex Optimization." International Joint Conference on Artificial Intelligence, 2019.](https://mlanthology.org/ijcai/2019/zhang2019ijcai-generalized/) doi:10.24963/IJCAI.2019/591
BibTeX
@inproceedings{zhang2019ijcai-generalized,
title = {{Generalized Majorization-Minimization for Non-Convex Optimization}},
author = {Zhang, Hu and Zhou, Pan and Yang, Yi and Feng, Jiashi},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2019},
pages = {4257--4263},
doi = {10.24963/IJCAI.2019/591},
url = {https://mlanthology.org/ijcai/2019/zhang2019ijcai-generalized/}
}