Bayesian Optimistic Optimisation with Exponentially Decaying Regret
Abstract
Bayesian optimisation (BO) is a well-known algorithm for finding the global optimum of expensive, black-box functions. Current practical BO algorithms have regret bounds ranging from $\mathcal{O}(\frac{\log N}{\sqrt{N}})$ to $\mathcal{O}(e^{-\sqrt{N}})$, where $N$ is the number of evaluations. This paper explores the possibility of improving the regret bound in the noise-free setting by intertwining concepts from BO and optimistic optimisation methods, which are based on partitioning the search space. We propose the BOO algorithm, a first practical approach that can achieve an exponential regret bound of order $\mathcal{O}(N^{-\sqrt{N}})$ under the assumption that the objective function is sampled from a Gaussian process with a Matérn kernel with smoothness parameter $\nu > 4 + \frac{D}{2}$, where $D$ is the number of dimensions. We perform experiments on the optimisation of various synthetic functions and on machine learning hyperparameter tuning tasks, and show that our algorithm outperforms the baselines.
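For intuition, below is a minimal, hypothetical sketch of the optimistic-optimisation ingredient in one dimension: a partition tree whose cells are scored by the centre value plus an optimism bonus that shrinks with cell size, and where the most optimistic cell is split at each step. The function name, the fixed bonus constant, and the midpoint split rule are illustrative assumptions; the paper's BOO algorithm instead derives its optimism from the Gaussian-process posterior under the Matérn kernel.

import heapq

def optimistic_optimise(f, lo, hi, n_evals=50, bonus=1.0):
    """Maximise f on [lo, hi] by repeatedly splitting the most optimistic cell."""
    centre = 0.5 * (lo + hi)
    val = f(centre)
    best_x, best_val = centre, val
    # Each heap entry: (negative optimistic score, lower bound, upper bound).
    # Scores are negated because heapq is a min-heap and we want the cell
    # with the highest score (value at centre + bonus * cell width) first.
    heap = [(-(val + bonus * (hi - lo)), lo, hi)]
    evals = 1
    while evals < n_evals:
        _, a, b = heapq.heappop(heap)
        mid = 0.5 * (a + b)
        # Split the chosen cell into two halves and evaluate each centre.
        for l, u in ((a, mid), (mid, b)):
            if evals >= n_evals:
                break
            c = 0.5 * (l + u)
            v = f(c)
            evals += 1
            if v > best_val:
                best_x, best_val = c, v
            heapq.heappush(heap, (-(v + bonus * (u - l)), l, u))
    return best_x, best_val

# Example usage on a toy objective with maximum at x = 0.3:
# x_star, f_star = optimistic_optimise(lambda x: -(x - 0.3) ** 2, 0.0, 1.0, n_evals=100)

The bonus term plays the role that the GP posterior standard deviation plays in the paper: it upper-bounds how much the function could exceed the observed centre value within the cell, so the search remains global while concentrating evaluations in promising regions.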
Cite
Text
Tran-The et al. "Bayesian Optimistic Optimisation with Exponentially Decaying Regret." International Conference on Machine Learning, 2021.

Markdown

[Tran-The et al. "Bayesian Optimistic Optimisation with Exponentially Decaying Regret." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/tranthe2021icml-bayesian/)

BibTeX
@inproceedings{tranthe2021icml-bayesian,
title = {{Bayesian Optimistic Optimisation with Exponentially Decaying Regret}},
author = {Tran-The, Hung and Gupta, Sunil and Rana, Santu and Venkatesh, Svetha},
booktitle = {International Conference on Machine Learning},
year = {2021},
pages = {10390--10400},
volume = {139},
url = {https://mlanthology.org/icml/2021/tranthe2021icml-bayesian/}
}