Triangulation Candidates for Bayesian Optimization
Abstract
Bayesian optimization involves an "inner optimization" over a new-data acquisition criterion that is non-convex and highly multi-modal, may be non-differentiable, or may otherwise thwart local numerical optimizers. In such cases it is common to replace the continuous search with a discrete one over random candidates. Here we propose using candidates based on a Delaunay triangulation of the existing input design. We detail the construction of these "tricands" and demonstrate empirically, on benchmark synthetic and real simulation experiments, that they outperform both numerically optimized acquisitions and random candidate-based alternatives, and that they are well-suited for hybrid schemes.
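As a rough illustration of the core idea (a minimal sketch, not the authors' reference implementation), the snippet below builds one candidate per simplex of a Delaunay triangulation of an existing design, taking simplex centroids as candidate locations via scipy.spatial.Delaunay; the paper's full "tricands" construction additionally places candidates near the boundary of the input domain. The function name triangulation_candidates and the toy design are illustrative assumptions.

# Minimal sketch (assumed names; not the authors' reference implementation):
# generate acquisition candidates from a Delaunay triangulation of an
# existing design, taking one candidate per simplex (its centroid).
import numpy as np
from scipy.spatial import Delaunay

def triangulation_candidates(X):
    """Return one candidate per Delaunay simplex of the n-by-d design X."""
    tri = Delaunay(X)
    # tri.simplices indexes the d+1 vertices of each simplex; averaging
    # those vertices gives the simplex centroid.
    return X[tri.simplices].mean(axis=1)

# Toy usage: a 20-point random design in [0, 1]^2.
rng = np.random.default_rng(0)
X = rng.uniform(size=(20, 2))
cands = triangulation_candidates(X)
print(cands.shape)  # (number of simplices, 2)

In a full Bayesian optimization loop these candidates would then be scored by the acquisition criterion, with the best-scoring candidate chosen as the next evaluation point.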
Cite
Text
Gramacy et al. "Triangulation Candidates for Bayesian Optimization." Neural Information Processing Systems, 2022.
Markdown
[Gramacy et al. "Triangulation Candidates for Bayesian Optimization." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/gramacy2022neurips-triangulation/)
BibTeX
@inproceedings{gramacy2022neurips-triangulation,
title = {{Triangulation Candidates for Bayesian Optimization}},
author = {Gramacy, Robert B. and Sauer, Annie and Wycoff, Nathan},
booktitle = {Neural Information Processing Systems},
year = {2022},
url = {https://mlanthology.org/neurips/2022/gramacy2022neurips-triangulation/}
}