Sub-Linear Regret Bounds for Bayesian Optimisation in Unknown Search Spaces

Abstract

Bayesian optimisation (BO) is a popular method for the efficient optimisation of expensive black-box functions. Traditionally, BO assumes that the search space is known. However, in many problems this assumption does not hold. To this end, we propose a novel BO algorithm which expands (and shifts) the search space over iterations, controlling the expansion rate through a \emph{hyperharmonic series}. Further, we propose another variant of our algorithm that scales to high dimensions. We show theoretically that for both our algorithms, the cumulative regret grows at sub-linear rates. Our experiments with synthetic and real-world optimisation tasks demonstrate the superiority of our algorithms over the current state-of-the-art methods for Bayesian optimisation in unknown search spaces.
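The key idea of controlling the expansion rate with a hyperharmonic series can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the function name, the hyperparameters `r0`, `c`, and `p`, and the hypercube parameterisation are all illustrative assumptions. A hyperharmonic series \(\sum_i c/i^p\) with \(p > 1\) converges, so the search space keeps growing yet remains bounded, which is what makes sub-linear regret attainable.

```python
import math

def expanded_bound(t, r0=1.0, c=0.5, p=2.0):
    """Illustrative half-width of the search hypercube at iteration t.

    The increments follow a hyperharmonic series c / i**p. With p > 1
    the series converges, so the search space expands at every step
    but its total volume stays bounded. (r0, c, p are hypothetical
    hyperparameters, not the paper's notation.)
    """
    return r0 + sum(c / i ** p for i in range(1, t + 1))

# At iteration t, BO would optimise the acquisition function over
# [-expanded_bound(t), expanded_bound(t)] in each dimension
# (optionally re-centred, i.e. "shifted", around the incumbent best).
bounds = [expanded_bound(t) for t in range(1, 51)]
```

Each step strictly enlarges the space, while the limit is `r0 + c * zeta(p)` (here about 1.82 for `p = 2`), so the expansion never runs away.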

Cite

Text

Tran-The et al. "Sub-Linear Regret Bounds for Bayesian Optimisation in Unknown Search Spaces." Neural Information Processing Systems, 2020.

Markdown

[Tran-The et al. "Sub-Linear Regret Bounds for Bayesian Optimisation in Unknown Search Spaces." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/tranthe2020neurips-sublinear/)

BibTeX

@inproceedings{tranthe2020neurips-sublinear,
  title     = {{Sub-Linear Regret Bounds for Bayesian Optimisation in Unknown Search Spaces}},
  author    = {Tran-The, Hung and Gupta, Sunil and Rana, Santu and Ha, Huong and Venkatesh, Svetha},
  booktitle = {Neural Information Processing Systems},
  year      = {2020},
  url       = {https://mlanthology.org/neurips/2020/tranthe2020neurips-sublinear/}
}