Distributionally Robust Bayesian Optimization

Abstract

Robustness to distributional shift is one of the key challenges of contemporary machine learning. Attaining such robustness is the goal of distributionally robust optimization, which seeks a solution to an optimization problem that is worst-case robust under a specified distributional shift of an uncontrolled covariate. In this paper, we study such a problem when the distributional shift is measured via the maximum mean discrepancy (MMD). For the setting of zeroth-order, noisy optimization, we present a novel distributionally robust Bayesian optimization algorithm (DRBO). Our algorithm provably obtains sub-linear robust regret in various settings that differ in how the uncertain covariate is observed. We demonstrate the robust performance of our method on both synthetic and real-world benchmarks.
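
As a brief sketch of the setup described in the abstract (the notation below is our own paraphrase, not the paper's exact statement): at each round the learner picks a decision $x_t$, while an uncontrolled covariate $c$ is drawn from an unknown distribution $Q$ that may deviate from a reference distribution $P_t$ by at most $\epsilon_t$ in MMD. The distributionally robust target and the associated robust regret then take the form

\[
x_t^\star \in \arg\max_{x \in \mathcal{X}} \; \inf_{Q \,:\, \mathrm{MMD}(Q, P_t) \le \epsilon_t} \mathbb{E}_{c \sim Q}\big[ f(x, c) \big],
\qquad
R_T = \sum_{t=1}^{T} \Big( \inf_{Q} \mathbb{E}_{c \sim Q}\big[ f(x_t^\star, c) \big] - \inf_{Q} \mathbb{E}_{c \sim Q}\big[ f(x_t, c) \big] \Big),
\]

where both infima range over the same MMD ball around $P_t$, and "sub-linear robust regret" means $R_T / T \to 0$ as $T \to \infty$.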

Cite

Text

Kirschner et al. "Distributionally Robust Bayesian Optimization." Artificial Intelligence and Statistics, 2020.

Markdown

[Kirschner et al. "Distributionally Robust Bayesian Optimization." Artificial Intelligence and Statistics, 2020.](https://mlanthology.org/aistats/2020/kirschner2020aistats-distributionally/)

BibTeX

@inproceedings{kirschner2020aistats-distributionally,
  title     = {{Distributionally Robust Bayesian Optimization}},
  author    = {Kirschner, Johannes and Bogunovic, Ilija and Jegelka, Stefanie and Krause, Andreas},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2020},
  pages     = {2174--2184},
  volume    = {108},
  url       = {https://mlanthology.org/aistats/2020/kirschner2020aistats-distributionally/}
}