Fast Zeroth-Order Convex Optimization with Quantum Gradient Methods
Abstract
We study quantum algorithms based on quantum (sub)gradient estimation using noisy function evaluation oracles, and demonstrate the first dimension-independent query complexities (up to poly-logarithmic factors) for zeroth-order convex optimization in both smooth and nonsmooth settings. Interestingly, using only noisy function evaluation oracles, we match the first-order query complexities of classical gradient descent, thereby exhibiting an exponential separation between quantum and classical zeroth-order optimization. We then generalize these algorithms to non-Euclidean settings by using quantum (sub)gradient estimation to instantiate mirror descent and its variants, including dual averaging and mirror prox. Leveraging a connection between semidefinite programming and eigenvalue optimization, we use our quantum mirror descent method to give a new quantum algorithm for solving semidefinite programs, linear programs, and zero-sum games. We identify a parameter regime in which our zero-sum game algorithm is faster than any existing classical or quantum approach.
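For intuition, the zeroth-order-to-mirror-descent pipeline the abstract describes can be sketched classically: below, the paper's quantum gradient estimator is replaced by a classical central-difference estimate (which costs O(d) function queries per step, unlike the quantum routine), and the estimate drives mirror descent with the negative-entropy mirror map on the probability simplex. This is a minimal illustrative sketch, not the paper's quantum algorithm; all function and parameter names are ours.

import numpy as np

def finite_diff_grad(f, x, h=1e-5):
    # Central-difference gradient estimate built from function
    # evaluations only (the classical stand-in for quantum
    # gradient estimation; it needs 2d queries per call).
    d = x.size
    g = np.empty(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def entropic_mirror_descent(f, d, steps=200, eta=0.1):
    # Mirror descent on the probability simplex with the
    # negative-entropy regularizer; the mirror step has the
    # closed form x <- x * exp(-eta * g) / Z (exponentiated gradient).
    x = np.full(d, 1.0 / d)          # uniform start on the simplex
    for _ in range(steps):
        g = finite_diff_grad(f, x)   # zeroth-order (sub)gradient estimate
        x = x * np.exp(-eta * g)     # multiplicative-weights update
        x /= x.sum()                 # renormalize onto the simplex
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    c = rng.random(5)
    # A smooth convex objective defined on all of R^d, so the
    # finite-difference probes just off the simplex are well defined.
    f = lambda x: c @ x + x @ x
    x_star = entropic_mirror_descent(f, d=5)
    print(x_star, f(x_star))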
Cite
Text

Kim et al. "Fast Zeroth-Order Convex Optimization with Quantum Gradient Methods." Advances in Neural Information Processing Systems, 2025.

Markdown

[Kim et al. "Fast Zeroth-Order Convex Optimization with Quantum Gradient Methods." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/kim2025neurips-fast/)

BibTeX
@inproceedings{kim2025neurips-fast,
title = {{Fast Zeroth-Order Convex Optimization with Quantum Gradient Methods}},
author = {Kim, Junhyung Lyle and Augustino, Brandon and Herman, Dylan and Fontana, Enrico and Watkins, Jacob and Pistoia, Marco and Chakrabarti, Shouvanik},
booktitle = {Advances in Neural Information Processing Systems},
year = {2025},
url = {https://mlanthology.org/neurips/2025/kim2025neurips-fast/}
}