A Framework for Bayesian Optimization in Embedded Subspaces

Abstract

We present a theoretically founded approach to high-dimensional Bayesian optimization based on low-dimensional subspace embeddings. We prove that the error in the Gaussian process model is tightly bounded when moving from the original high-dimensional search domain to the low-dimensional embedding. This implies that optimization in the low-dimensional embedding proceeds essentially as if it were run directly on an unknown low-dimensional active subspace. The argument applies to a large class of algorithms and GP models, including non-stationary kernels. Moreover, we provide an efficient implementation based on hashing and demonstrate empirically that this subspace embedding achieves considerably better results than previously proposed methods for high-dimensional BO based on Gaussian matrix projections and structure learning.
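To make the hashing-based embedding concrete: in a count-sketch-style construction, each of the D original coordinates is assigned a random target coordinate in the d-dimensional embedded space together with a random sign, and a low-dimensional point y is lifted to a high-dimensional point x via x_i = s_i * y_{h(i)}. The Python sketch below illustrates this idea under those assumptions; the names make_hashing_embedding and lift are ours for illustration and are not taken from the authors' code.

import numpy as np

def make_hashing_embedding(D, d, seed=None):
    """Sample a count-sketch-style embedding of R^d into R^D.

    Each high-dimensional coordinate i gets a random bucket h[i] in
    {0, ..., d-1} and a random sign s[i] in {-1, +1}.
    """
    rng = np.random.default_rng(seed)
    h = rng.integers(0, d, size=D)       # hash: coordinate i -> bucket h[i]
    s = rng.choice([-1.0, 1.0], size=D)  # random signs

    def lift(y):
        # Lift y in [-1, 1]^d to x in [-1, 1]^D via x_i = s_i * y_{h(i)}.
        y = np.asarray(y, dtype=float)
        return s * y[h]

    return lift

# Example: run BO in d = 4 dimensions on behalf of a D = 100-dimensional objective.
lift = make_hashing_embedding(D=100, d=4, seed=0)
y = np.array([0.3, -0.8, 0.1, 0.9])  # candidate proposed in the embedded space
x = lift(y)                          # point to evaluate in the original domain

Note that, unlike a dense Gaussian projection, this lift maps the box [-1, 1]^d into the box [-1, 1]^D, so candidate points never need to be clipped back into the search domain.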

Cite

Text

Nayebi et al. "A Framework for Bayesian Optimization in Embedded Subspaces." International Conference on Machine Learning, 2019.

Markdown

[Nayebi et al. "A Framework for Bayesian Optimization in Embedded Subspaces." International Conference on Machine Learning, 2019.](https://mlanthology.org/icml/2019/nayebi2019icml-framework/)

BibTeX

@inproceedings{nayebi2019icml-framework,
  title     = {{A Framework for Bayesian Optimization in Embedded Subspaces}},
  author    = {Nayebi, Amin and Munteanu, Alexander and Poloczek, Matthias},
  booktitle = {International Conference on Machine Learning},
  year      = {2019},
  pages     = {4752--4761},
  volume    = {97},
  url       = {https://mlanthology.org/icml/2019/nayebi2019icml-framework/}
}