Sample Complexity for Quadratic Bandits: Hessian Dependent Bounds and Optimal Algorithms
Abstract
In stochastic zeroth-order optimization, a problem of practical relevance is understanding how to fully exploit the local geometry of the underlying objective function. We consider a fundamental setting in which the objective function is quadratic, and provide the first tight characterization of the optimal Hessian-dependent sample complexity. Our contribution is twofold. First, from an information-theoretic point of view, we prove tight lower bounds on Hessian-dependent complexities by introducing a concept called \emph{energy allocation}, which captures the interaction between the search algorithm and the geometry of the objective function. A matching upper bound is obtained by solving for the optimal energy spectrum. Second, algorithmically, we show the existence of a Hessian-independent algorithm that universally achieves the asymptotically optimal sample complexity for all Hessian instances. The optimal sample complexity achieved by our algorithm remains valid under heavy-tailed noise distributions, a robustness enabled by a truncation method.
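To fix notation (a hedged reading of the setting, which the abstract does not spell out): in a quadratic bandit, the learner sequentially queries points $x_t$ and observes only noisy function values,

\[ f(x) = \tfrac{1}{2}\, x^{\top} H x, \qquad y_t = f(x_t) + \varepsilon_t, \]

where the Hessian $H$ is unknown and $\varepsilon_t$ is zero-mean noise. A Hessian-dependent sample complexity bounds the number of queries needed to locate the minimizer as a function of $H$ itself, rather than of the dimension alone.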
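The abstract names a truncation method for heavy-tailed noise but gives no details. The sketch below shows only the standard truncation idea under our own illustrative assumptions; the estimator, the `truncated_mean` helper, and the threshold choice are hypothetical, not the paper's construction.

```python
import numpy as np

def truncated_mean(samples: np.ndarray, threshold: float) -> float:
    """Estimate a mean under heavy-tailed noise by clipping first.

    Clipping each observation to [-threshold, threshold] trades a small,
    controllable bias for bounded variance, so standard concentration
    applies even when the raw observations have infinite variance. The
    threshold is a free parameter here; the paper's specific choice is
    not given in this abstract.
    """
    return float(np.clip(samples, -threshold, threshold).mean())

# Example: estimate a function value of 1.0 under Student-t (df=2) noise,
# whose variance is infinite, so the plain sample mean concentrates poorly.
rng = np.random.default_rng(0)
observations = 1.0 + rng.standard_t(df=2, size=10_000)
print(truncated_mean(observations, threshold=10.0))
```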
Cite

Text:
Yu et al. "Sample Complexity for Quadratic Bandits: Hessian Dependent Bounds and Optimal Algorithms." Neural Information Processing Systems, 2023.

Markdown:
[Yu et al. "Sample Complexity for Quadratic Bandits: Hessian Dependent Bounds and Optimal Algorithms." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/yu2023neurips-sample/)

BibTeX:
@inproceedings{yu2023neurips-sample,
  title     = {{Sample Complexity for Quadratic Bandits: Hessian Dependent Bounds and Optimal Algorithms}},
  author    = {Yu, Qian and Wang, Yining and Huang, Baihe and Lei, Qi and Lee, Jason},
  booktitle = {Neural Information Processing Systems},
  year      = {2023},
  url       = {https://mlanthology.org/neurips/2023/yu2023neurips-sample/}
}