Faster Gradient-Free Algorithms for Nonsmooth Nonconvex Stochastic Optimization
Abstract
We consider optimization problems of the form $\min_{x \in \mathbb{R}^d} f(x) \triangleq \mathbb{E}[F(x;\xi)]$, where the component $F(x;\xi)$ is $L$-mean-squared Lipschitz but possibly nonconvex and nonsmooth. The recently proposed gradient-free method requires at most $\mathcal{O}( L^4 d^{3/2} \epsilon^{-4} + \Delta L^3 d^{3/2} \delta^{-1} \epsilon^{-4})$ stochastic zeroth-order oracle calls to find a $(\delta,\epsilon)$-Goldstein stationary point of the objective function, where $\Delta = f(x_0) - \inf_{x \in \mathbb{R}^d} f(x)$ and $x_0$ is the initial point of the algorithm. This paper proposes a more efficient algorithm using stochastic recursive gradient estimators, which improves the complexity to $\mathcal{O}(L^3 d^{3/2} \epsilon^{-3}+ \Delta L^2 d^{3/2} \delta^{-1} \epsilon^{-3})$.
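Here, a point $x$ is a $(\delta,\epsilon)$-Goldstein stationary point if $\mathrm{dist}(0, \partial_\delta f(x)) \le \epsilon$, where $\partial_\delta f(x) = \mathrm{conv}\big(\bigcup_{y:\|y-x\|\le\delta} \partial f(y)\big)$ is the Goldstein $\delta$-subdifferential. The key ingredient named in the abstract is a stochastic recursive gradient estimator driven by zeroth-order (function-value-only) queries. The sketch below shows one common way to combine a two-point sphere-sampling gradient estimator with a SPIDER/SARAH-style recursive update. It is a minimal illustration under assumed interfaces, not the authors' exact algorithm: the names zo_grad, zo_grad_diff, zo_spider, and sample_xi, the batch sizes, and the normalized step rule are all illustrative choices.

import numpy as np


def zo_grad(F, x, xi, delta, rng):
    """Two-point zeroth-order gradient estimate of the delta-smoothed
    objective, using a uniform random direction on the unit sphere."""
    d = x.size
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)  # uniform direction on the unit sphere
    return (d / (2.0 * delta)) * (F(x + delta * u, xi) - F(x - delta * u, xi)) * u


def zo_grad_diff(F, x, x_prev, xi, delta, rng):
    """Difference of two-point estimates at x and x_prev using the SAME
    sample xi and the SAME direction u (common random numbers keep the
    variance of the recursive correction small)."""
    d = x.size
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)
    scale = d / (2.0 * delta)
    gx = scale * (F(x + delta * u, xi) - F(x - delta * u, xi)) * u
    gp = scale * (F(x_prev + delta * u, xi) - F(x_prev - delta * u, xi)) * u
    return gx - gp


def zo_spider(F, sample_xi, x0, delta, eta, T, q, b_large, b_small, seed=0):
    """SPIDER-style recursive gradient estimation with zeroth-order oracles:
    refresh the estimate on a large batch every q steps, and in between
    apply the recursive correction v_t = v_{t-1} + g_t(x_t) - g_t(x_{t-1})."""
    rng = np.random.default_rng(seed)
    x_prev, x = x0.copy(), x0.copy()
    v = np.zeros_like(x0)
    for t in range(T):
        if t % q == 0:
            # Large-batch refresh of the gradient estimate.
            v = np.mean(
                [zo_grad(F, x, sample_xi(rng), delta, rng) for _ in range(b_large)],
                axis=0,
            )
        else:
            # Small-batch recursive correction along the trajectory.
            v = v + np.mean(
                [zo_grad_diff(F, x, x_prev, sample_xi(rng), delta, rng)
                 for _ in range(b_small)],
                axis=0,
            )
        # Normalized step; a plain step x - eta * v is the other common choice.
        x_prev, x = x, x - eta * v / max(np.linalg.norm(v), 1e-12)
    return x

Roughly speaking, the $d/(2\delta)$ factor in the two-point estimator is where the dimension dependence in the bounds above enters, while the recursive correction $v_t = v_{t-1} + g_t(x_t) - g_t(x_{t-1})$ reduces the variance of the estimator along the trajectory, which is the standard mechanism behind improving an $\epsilon^{-4}$-type rate to $\epsilon^{-3}$.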
Cite
Text
Chen et al. "Faster Gradient-Free Algorithms for Nonsmooth Nonconvex Stochastic Optimization." International Conference on Machine Learning, 2023.Markdown
[Chen et al. "Faster Gradient-Free Algorithms for Nonsmooth Nonconvex Stochastic Optimization." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/chen2023icml-faster/)BibTeX
@inproceedings{chen2023icml-faster,
title = {{Faster Gradient-Free Algorithms for Nonsmooth Nonconvex Stochastic Optimization}},
author = {Chen, Lesi and Xu, Jing and Luo, Luo},
booktitle = {International Conference on Machine Learning},
year = {2023},
pages = {5219--5233},
volume = {202},
url = {https://mlanthology.org/icml/2023/chen2023icml-faster/}
}