Quantum Algorithms for Non-Smooth Non-Convex Optimization
Abstract
This paper considers the problem of finding a $(\delta,\epsilon)$-Goldstein stationary point of a Lipschitz continuous objective, a rich function class that covers a wide range of important applications. We construct a novel zeroth-order quantum estimator for the gradient of the smoothed surrogate. Based on this estimator, we propose a novel quantum algorithm that achieves a query complexity of $\tilde{\mathcal{O}}(d^{3/2}\delta^{-1}\epsilon^{-3})$ on the stochastic function value oracle, where $d$ is the dimension of the problem. We further improve this query complexity to $\tilde{\mathcal{O}}(d^{3/2}\delta^{-1}\epsilon^{-7/3})$ by introducing a variance reduction variant. Our findings demonstrate the clear advantages of utilizing quantum techniques for non-convex non-smooth optimization, as they outperform the optimal classical methods in the dependence on $\epsilon$ by a factor of $\epsilon^{-2/3}$.
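For reference, the standard definitions behind these terms, assuming the usual Goldstein subdifferential and uniform-ball randomized smoothing used in this line of work (the paper's exact conventions may differ), are
\[
\partial_\delta f(x) \;=\; \mathrm{conv}\Bigl(\textstyle\bigcup_{y\in\mathbb{B}_\delta(x)} \partial f(y)\Bigr),
\qquad
x \text{ is } (\delta,\epsilon)\text{-Goldstein stationary} \;\Longleftrightarrow\; \min_{g\in\partial_\delta f(x)}\|g\|\le\epsilon,
\]
with the smoothed surrogate and its zeroth-order gradient identity
\[
f_\delta(x) \;=\; \mathbb{E}_{u\sim\mathrm{Unif}(\mathbb{B}_1(0))}\bigl[f(x+\delta u)\bigr],
\qquad
\nabla f_\delta(x) \;=\; \frac{d}{\delta}\,\mathbb{E}_{w\sim\mathrm{Unif}(\mathbb{S}^{d-1})}\bigl[f(x+\delta w)\,w\bigr],
\]
the latter being the quantity that the zeroth-order quantum estimator approximates from stochastic function-value queries.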
Cite
Text
Liu et al. "Quantum Algorithms for Non-Smooth Non-Convex Optimization." Neural Information Processing Systems, 2024. doi:10.52202/079017-1112
Markdown
[Liu et al. "Quantum Algorithms for Non-Smooth Non-Convex Optimization." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/liu2024neurips-quantum/) doi:10.52202/079017-1112
BibTeX
@inproceedings{liu2024neurips-quantum,
title = {{Quantum Algorithms for Non-Smooth Non-Convex Optimization}},
author = {Liu, Chengchang and Guan, Chaowen and He, Jianhao and Lui, John C.S.},
booktitle = {Neural Information Processing Systems},
year = {2024},
doi = {10.52202/079017-1112},
url = {https://mlanthology.org/neurips/2024/liu2024neurips-quantum/}
}