Single Loop Gaussian Homotopy Method for Non-Convex Optimization
Abstract
The Gaussian homotopy (GH) method is a popular approach to finding better stationary points of non-convex optimization problems: it gradually reduces a parameter value $t$, which morphs the problem to be solved from an almost convex one into the original target one. Existing GH-based methods repeatedly call an iterative optimization solver to find a stationary point every time $t$ is updated, which incurs high computational costs. We propose a novel single-loop framework for GH methods (SLGH) that updates the parameter $t$ and the optimization decision variables at the same time. We analyze the computational complexity of the SLGH algorithm in several settings: with either a gradient or a gradient-free oracle of a GH function, in both deterministic and stochastic regimes. With a tuned hyperparameter, the convergence rate of SLGH matches that of gradient descent, even though the problem to be solved gradually changes as $t$ decreases. In numerical experiments, our SLGH algorithms converge faster than an existing double-loop GH method while outperforming gradient-descent-based methods at finding better solutions.
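To make the single-loop idea concrete, below is a minimal Python sketch, not the paper's exact algorithm. It assumes access to a gradient oracle of the original objective, estimates the gradient of the GH function $f_t(x) = \mathbb{E}_{u \sim \mathcal{N}(0, I)}[f(x + t u)]$ (standard Gaussian smoothing) by Monte Carlo, and decays $t$ multiplicatively each iteration; the function name, step size `eta`, sample count, and decay rate `gamma` are illustrative assumptions, not the paper's tuned update rules.

```python
import numpy as np

def slgh_sketch(f_grad, x0, t0=1.0, eta=0.05, gamma=0.999,
                n_samples=16, n_iters=2000, seed=0):
    """Minimal single-loop Gaussian homotopy (SLGH-style) sketch.

    f_grad: gradient oracle of the original objective f (an assumption;
            the paper also covers gradient-free and stochastic oracles).
    t0, eta, gamma: illustrative smoothing level, step size, and
            multiplicative decay for t -- not the paper's exact rules.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    t = t0
    for _ in range(n_iters):
        # Monte Carlo estimate of the gradient of the GH function
        # f_t(x) = E_{u ~ N(0, I)}[f(x + t*u)], using the identity
        # grad f_t(x) = E_u[grad f(x + t*u)] for smooth f.
        u = rng.standard_normal((n_samples, x.size))
        g = np.mean([f_grad(x + t * ui) for ui in u], axis=0)
        # Single loop: x and t are updated in the same iteration,
        # instead of solving each smoothed problem to stationarity.
        x -= eta * g
        t *= gamma
    return x

# Hypothetical toy usage on a simple non-convex objective:
# f(x) = 0.5*||x||^2 + 3*sum(sin(x)^2), so grad f(x) = x + 3*sin(2x).
if __name__ == "__main__":
    f_grad = lambda x: x + 3.0 * np.sin(2.0 * x)
    print(slgh_sketch(f_grad, x0=np.full(5, 4.0)))
```

For contrast, a double-loop GH method would run the inner $x$-updates to (approximate) stationarity for each fixed $t$ before decreasing it; the single-loop variant interleaves the two updates, which is what removes the nested solver calls and their associated cost.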
Cite

Text:
Iwakiri et al. "Single Loop Gaussian Homotopy Method for Non-Convex Optimization." Neural Information Processing Systems, 2022.

Markdown:
[Iwakiri et al. "Single Loop Gaussian Homotopy Method for Non-Convex Optimization." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/iwakiri2022neurips-single/)

BibTeX:
@inproceedings{iwakiri2022neurips-single,
title = {{Single Loop Gaussian Homotopy Method for Non-Convex Optimization}},
author = {Iwakiri, Hidenori and Wang, Yuhang and Ito, Shinji and Takeda, Akiko},
booktitle = {Neural Information Processing Systems},
year = {2022},
url = {https://mlanthology.org/neurips/2022/iwakiri2022neurips-single/}
}