Learning Relevant Contextual Variables Within Bayesian Optimization

Abstract

Contextual Bayesian Optimization (CBO) efficiently optimizes black-box, expensive-to-evaluate functions with respect to design variables, while simultaneously integrating relevant contextual information about the environment, such as experimental conditions. However, the relevance of contextual variables is not necessarily known beforehand. Moreover, contextual variables can sometimes be optimized themselves, a setting overlooked by current CBO algorithms. Optimizing contextual variables may be costly, which raises the question of determining a minimal relevant subset. We address this problem with a novel method, Sensitivity-Analysis-Driven Contextual BO (SADCBO). We learn the relevance of context variables via sensitivity analysis of the posterior surrogate model, while minimizing the cost of optimization by leveraging recent developments in early stopping for BO. We empirically evaluate the proposed SADCBO against alternatives on both synthetic and real-world experiments, and demonstrate a consistent improvement across examples.
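The paper's exact sensitivity-analysis procedure is not detailed in the abstract, but the general idea of scoring input relevance from a posterior surrogate can be illustrated with a common proxy: fitting a Gaussian process with an ARD (one length-scale per dimension) kernel and reading inverse length-scales as relevance scores. The sketch below is a hypothetical illustration of this generic technique, not the SADCBO algorithm itself; the toy function and variable names are invented for the example.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Toy setup: one design variable x, a relevant context c1, and an
# irrelevant context c2 that does not affect the objective at all.
X = rng.uniform(-1.0, 1.0, size=(80, 3))
y = np.sin(3.0 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.01 * rng.normal(size=80)

# ARD kernel: an independent length-scale is learned per input dimension.
kernel = RBF(length_scale=np.ones(3), length_scale_bounds=(1e-2, 1e3))
gp = GaussianProcessRegressor(kernel=kernel, alpha=1e-3, normalize_y=True)
gp.fit(X, y)

# Inverse length-scales as a crude sensitivity score: a small learned
# length-scale means the posterior varies quickly along that input,
# i.e. the input is relevant; a large one means it is nearly ignored.
relevance = 1.0 / gp.kernel_.length_scale
print(relevance)
```

With this construction, the score for the irrelevant context c2 (last entry) comes out lowest, so thresholding such scores gives a simple way to select a reduced set of context variables to carry into the optimization loop.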

Cite

Text

Martinelli et al. "Learning Relevant Contextual Variables Within Bayesian Optimization." NeurIPS 2023 Workshops: ReALML, 2023.

Markdown

[Martinelli et al. "Learning Relevant Contextual Variables Within Bayesian Optimization." NeurIPS 2023 Workshops: ReALML, 2023.](https://mlanthology.org/neuripsw/2023/martinelli2023neuripsw-learning/)

BibTeX

@inproceedings{martinelli2023neuripsw-learning,
  title     = {{Learning Relevant Contextual Variables Within Bayesian Optimization}},
  author    = {Martinelli, Julien and Bharti, Ayush and Tiihonen, Armi and Filstroff, Louis and John, S. T. and Sloman, Sabina J. and Rinke, Patrick and Kaski, Samuel},
  booktitle = {NeurIPS 2023 Workshops: ReALML},
  year      = {2023},
  url       = {https://mlanthology.org/neuripsw/2023/martinelli2023neuripsw-learning/}
}