Global Optimization with Parametric Function Approximation

Abstract

We consider the problem of global optimization with noisy zeroth-order oracles — a well-motivated problem useful for various applications ranging from hyper-parameter tuning for deep learning to new material design. Existing work relies on Gaussian processes or other non-parametric families, which suffer from the curse of dimensionality. In this paper, we propose a new algorithm, GO-UCB, that leverages a parametric family of functions (e.g., neural networks) instead. Under a realizability assumption and a few other mild geometric conditions, we show that GO-UCB achieves a cumulative regret of $\tilde{O}(\sqrt{T})$ where $T$ is the time horizon. At the core of GO-UCB is a carefully designed uncertainty set over parameters, based on gradients, that allows optimistic exploration. Synthetic and real-world experiments show that GO-UCB outperforms popular Bayesian optimization approaches, even when the model is misspecified.
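To make the optimistic-exploration idea concrete, here is a minimal sketch in the spirit of the algorithm described above, not the paper's actual method or calibrated constants. It uses a linear-in-parameters model $f(x;\theta) = \theta^\top \phi(x)$ so that the gradient $\nabla_\theta f(x;\theta) = \phi(x)$ and the gradient-based uncertainty set reduces to an exact ridge-regression confidence ellipsoid; the feature map, objective, and constants `lam`, `beta`, `noise` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature map for f(x; theta) = theta @ phi(x). For this
# linear-in-parameters family, grad_theta f(x; theta) = phi(x), so a
# gradient-based uncertainty set over theta is exactly a ridge-regression
# confidence ellipsoid. Constants here are illustrative, not the paper's.
def phi(x):
    return np.array([np.sin(x), np.cos(x), np.sin(2 * x), np.cos(2 * x)])

f_star = lambda x: np.sin(2 * x)          # unknown objective (realizable)
lam, beta, noise = 1.0, 0.5, 0.1          # ridge, exploration, noise scales
X_grid = np.linspace(0.0, np.pi, 200)     # candidate points
Phi_grid = np.stack([phi(x) for x in X_grid])

Sigma = lam * np.eye(4)                   # Sigma_t = lam*I + sum_i g_i g_i^T
b = np.zeros(4)
xs = []
for t in range(30):
    theta_hat = np.linalg.solve(Sigma, b)           # ridge estimate of theta
    Sinv = np.linalg.inv(Sigma)
    # Optimistic (UCB) acquisition: plug-in prediction plus the width of
    # the parameter ellipsoid along the gradient direction at each x.
    ucb = Phi_grid @ theta_hat + beta * np.sqrt(
        np.einsum("ij,jk,ik->i", Phi_grid, Sinv, Phi_grid))
    x_t = X_grid[int(np.argmax(ucb))]
    y_t = f_star(x_t) + noise * rng.standard_normal()  # noisy zeroth-order oracle
    g = phi(x_t)                                       # gradient at chosen point
    Sigma += np.outer(g, g)
    b += y_t * g
    xs.append(x_t)

best = max(f_star(x) for x in xs)
print(f"best sampled value: {best:.3f} (global optimum 1.0)")
```

With a neural-network model the same loop would instead linearize around the fitted parameters, using $\nabla_\theta f(x;\hat\theta)$ in place of $\phi(x)$, which is where the gradient-based uncertainty set becomes nontrivial.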

Cite

Text

Liu and Wang. "Global Optimization with Parametric Function Approximation." International Conference on Machine Learning, 2023.

Markdown

[Liu and Wang. "Global Optimization with Parametric Function Approximation." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/liu2023icml-global/)

BibTeX

@inproceedings{liu2023icml-global,
  title     = {{Global Optimization with Parametric Function Approximation}},
  author    = {Liu, Chong and Wang, Yu-Xiang},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {22113--22136},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/liu2023icml-global/}
}