Quantum Speedups for Stochastic Optimization
Abstract
We consider the problem of minimizing a continuous function given access to a natural quantum generalization of a stochastic gradient oracle. We provide two new methods for the special case of minimizing a Lipschitz convex function. Each method obtains a dimension-versus-accuracy trade-off that is provably unachievable classically, and we prove that one method is asymptotically optimal in low-dimensional settings. Additionally, we provide quantum algorithms for computing a critical point of a smooth non-convex function at rates not known to be achievable classically. To obtain these results we build upon the quantum multivariate mean estimation result of Cornelissen et al. and provide a general quantum variance reduction technique of independent interest.
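For readers unfamiliar with the oracle model the abstract refers to, the following is a minimal sketch of the standard stochastic first-order setup in LaTeX; the notation ($F$, $f$, $g$, $\sigma$) is illustrative and not taken from the paper itself.

% Standard stochastic optimization setup (illustrative notation,
% not quoted from the paper).
\[
  \min_{x \in \mathbb{R}^d} F(x), \qquad F(x) := \mathbb{E}_{\xi}\bigl[ f(x, \xi) \bigr].
\]
% A classical stochastic gradient oracle returns, on query x,
% a random vector g(x) that is unbiased with bounded variance:
\[
  \mathbb{E}\bigl[ g(x) \bigr] = \nabla F(x), \qquad
  \mathbb{E}\bigl[ \lVert g(x) - \nabla F(x) \rVert^2 \bigr] \le \sigma^2.
\]
% The paper's oracle is a natural quantum (coherent) generalization of
% this sampler, which is what enables quantum mean-estimation speedups.

The quantum speedups arise because such a coherent oracle can be queried in superposition, allowing gradient means to be estimated with fewer queries than classical sampling permits.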
Cite
Text
Sidford and Zhang. "Quantum Speedups for Stochastic Optimization." Neural Information Processing Systems, 2023.
Markdown
[Sidford and Zhang. "Quantum Speedups for Stochastic Optimization." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/sidford2023neurips-quantum/)
BibTeX
@inproceedings{sidford2023neurips-quantum,
  title = {{Quantum Speedups for Stochastic Optimization}},
  author = {Sidford, Aaron and Zhang, Chenyi},
  booktitle = {Neural Information Processing Systems},
  year = {2023},
  url = {https://mlanthology.org/neurips/2023/sidford2023neurips-quantum/}
}