High-Dimensional Bayesian Optimization via Nested Riemannian Manifolds

Abstract

Despite the recent success of Bayesian optimization (BO) in a variety of applications where sample efficiency is imperative, its performance may be seriously compromised in settings characterized by high-dimensional parameter spaces. A solution to preserve the sample efficiency of BO in such problems is to introduce domain knowledge into its formulation. In this paper, we propose to exploit the geometry of non-Euclidean search spaces, which often arise in a variety of domains, to learn structure-preserving mappings and optimize the acquisition function of BO in low-dimensional latent spaces. Our approach, built on Riemannian manifold theory, features geometry-aware Gaussian processes that jointly learn a nested-manifolds embedding and a representation of the objective function in the latent space. We test our approach on several artificial benchmark landscapes and report that it not only outperforms other high-dimensional BO approaches in several settings, but also consistently optimizes the objective functions, unlike geometry-unaware BO methods.
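To make the idea of a geometry-aware Gaussian process concrete, the sketch below fits a GP to a toy objective defined on the unit sphere, replacing the usual Euclidean distance in the kernel with the geodesic (arc-length) distance. This is a minimal illustration, not the paper's method: the kernel choice, lengthscale, and objective are all assumptions made for the example (note that the geodesic squared-exponential kernel is not positive definite on the sphere in general, which is why a Laplace-type kernel is used here).

```python
import numpy as np

def sphere_geodesic(X, Y):
    # Geodesic (arc-length) distance between unit vectors on the sphere.
    cos = np.clip(X @ Y.T, -1.0, 1.0)
    return np.arccos(cos)

def geodesic_laplace_kernel(X, Y, lengthscale=0.5):
    # Laplace-type kernel built on geodesic distance; a hypothetical
    # stand-in for the geometry-aware kernels discussed in the paper.
    return np.exp(-sphere_geodesic(X, Y) / lengthscale)

def gp_posterior_mean(X_train, y_train, X_test, noise=1e-6):
    # Standard GP regression posterior mean, with the geometry entering
    # only through the kernel.
    K = geodesic_laplace_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = geodesic_laplace_kernel(X_test, X_train)
    return K_s @ np.linalg.solve(K, y_train)

# Toy data: random points on the unit sphere S^2 and a smooth objective.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)
y = X[:, 2]  # objective depends only on "latitude"

mu = gp_posterior_mean(X, y, X)
print(np.max(np.abs(mu - y)))  # near-interpolation at the training points
```

In a BO loop, such a geometry-aware GP would serve as the surrogate model, with the acquisition function optimized directly on the manifold (or, as in the paper, in a learned low-dimensional nested-manifold latent space) rather than in the ambient Euclidean space.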

Cite

Text

Jaquier and Rozo. "High-Dimensional Bayesian Optimization via Nested Riemannian Manifolds." Neural Information Processing Systems, 2020.

Markdown

[Jaquier and Rozo. "High-Dimensional Bayesian Optimization via Nested Riemannian Manifolds." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/jaquier2020neurips-highdimensional/)

BibTeX

@inproceedings{jaquier2020neurips-highdimensional,
  title     = {{High-Dimensional Bayesian Optimization via Nested Riemannian Manifolds}},
  author    = {Jaquier, Noémie and Rozo, Leonel},
  booktitle = {Neural Information Processing Systems},
  year      = {2020},
  url       = {https://mlanthology.org/neurips/2020/jaquier2020neurips-highdimensional/}
}