Two-Stage Kernel Bayesian Optimization in High Dimensions
Abstract
Bayesian optimization is a popular method for optimizing expensive black-box functions, yet it often struggles in high dimensions, where the computation can be prohibitively heavy. While a complex kernel with many length scales is prone to overfitting and expensive to train, a simple coarse kernel with too few length scales cannot effectively capture the variations of a high-dimensional function in different directions. To alleviate this problem, we introduce CobBO: a Bayesian optimization algorithm with two-stage kernels and a coordinate backoff stopping rule. It adaptively selects a promising low-dimensional subspace and projects past measurements into it using a computationally efficient coarse kernel. Within the subspace, the computational cost of conducting Bayesian optimization with a more flexible and accurate kernel becomes affordable, so a sequence of consecutive observations in the same subspace is collected until a stopping rule is met. Extensive evaluations show that CobBO finds solutions comparable to or better than those of other state-of-the-art methods for dimensions ranging from tens to hundreds, while reducing both the trial complexity and the computational cost.
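The two-stage idea in the abstract can be illustrated with a minimal sketch: pick a low-dimensional coordinate subspace, project past observations into it, and run Gaussian-process-based optimization there with a per-dimension kernel. The subspace choice, the projection (simple coordinate dropping with the incumbent filling the remaining coordinates), the length-scale heuristics, and the use of the posterior mean in place of a proper acquisition function are all simplified stand-ins of my own, not the paper's actual CobBO procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(A, B, ls):
    # RBF kernel with per-dimension length scales `ls`
    d = (A[:, None, :] - B[None, :, :]) / ls
    return np.exp(-0.5 * np.sum(d * d, axis=-1))

def gp_posterior_mean(Xtr, ytr, Xte, ls, noise=1e-4):
    # GP posterior mean at test points Xte given training data (Xtr, ytr)
    K = rbf(Xtr, Xtr, ls) + noise * np.eye(len(Xtr))
    Ks = rbf(Xte, Xtr, ls)
    return Ks @ np.linalg.solve(K, ytr)

def objective(x):
    # Toy high-dimensional objective to minimize (hypothetical)
    return float(np.sum((x - 0.5) ** 2))

D, d_sub, n_init, n_steps = 20, 3, 10, 30
X = rng.uniform(0.0, 1.0, size=(n_init, D))
y = np.array([objective(x) for x in X])

for _ in range(n_steps):
    best = X[np.argmin(y)]
    # Stage 1 (simplified): pick a random coordinate subspace and project
    # past points into it; the incumbent `best` supplies the values of the
    # coordinates outside the subspace.
    coords = rng.choice(D, size=d_sub, replace=False)
    Xs = X[:, coords]
    # Stage 2 (simplified): a more flexible per-dimension (ARD-style) kernel
    # inside the low-dimensional subspace, where fitting is cheap.
    ls = np.std(Xs, axis=0) + 1e-3
    cand = rng.uniform(0.0, 1.0, size=(256, d_sub))
    mu = gp_posterior_mean(Xs, y, cand, ls)
    x_new = best.copy()
    x_new[coords] = cand[np.argmin(mu)]
    X = np.vstack([X, x_new])
    y = np.append(y, objective(x_new))

print(round(float(y.min()), 3))
```

Because every new point reuses the incumbent's coordinates outside the subspace, the loop cheaply refines a few directions at a time, which is the computational motivation the abstract gives for working in low-dimensional subspaces.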
Cite
Text
Tan and Nayman. "Two-Stage Kernel Bayesian Optimization in High Dimensions." Uncertainty in Artificial Intelligence, 2023.
Markdown
[Tan and Nayman. "Two-Stage Kernel Bayesian Optimization in High Dimensions." Uncertainty in Artificial Intelligence, 2023.](https://mlanthology.org/uai/2023/tan2023uai-twostage/)
BibTeX
@inproceedings{tan2023uai-twostage,
  title = {{Two-Stage Kernel Bayesian Optimization in High Dimensions}},
  author = {Tan, Jian and Nayman, Niv},
  booktitle = {Uncertainty in Artificial Intelligence},
  year = {2023},
  pages = {2099--2110},
  volume = {216},
  url = {https://mlanthology.org/uai/2023/tan2023uai-twostage/}
}