Efficient Distributionally Robust Bayesian Optimization with Worst-Case Sensitivity

Abstract

In distributionally robust Bayesian optimization (DRBO), an exact computation of the worst-case expected value requires solving an expensive convex optimization problem. We develop a fast approximation of the worst-case expected value based on the notion of worst-case sensitivity that caters to arbitrary convex distribution distances. We provide a regret bound for our novel DRBO algorithm with the fast approximation, and empirically show that it is competitive with the algorithm using the exact worst-case expected value while incurring significantly less computation time. In order to guide the choice of distribution distance to be used with DRBO, we show that our approximation implicitly optimizes an objective close to an interpretable risk-sensitive value.
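As a minimal illustration of the exact computation the abstract refers to, the sketch below computes the worst-case expected value of a function over all distributions within a total-variation ball around a reference distribution, by solving the resulting linear program. This is an assumed simplified setting (finite support, TV distance, SciPy's LP solver), not the paper's implementation or its sensitivity-based approximation.

```python
import numpy as np
from scipy.optimize import linprog

def worst_case_expectation(f, p, eps):
    """Minimize sum_i q_i * f_i over distributions q with
    TV(q, p) = 0.5 * sum_i |q_i - p_i| <= eps, cast as an LP.

    Decision vector x = [q, t], where t_i >= |q_i - p_i|.
    """
    n = len(f)
    c = np.concatenate([f, np.zeros(n)])  # objective: f @ q
    I = np.eye(n)
    # Inequalities:  q - t <= p,  -q - t <= -p,  sum(t) <= 2 * eps.
    A_ub = np.vstack([
        np.hstack([I, -I]),
        np.hstack([-I, -I]),
        np.concatenate([np.zeros(n), np.ones(n)])[None, :],
    ])
    b_ub = np.concatenate([p, -p, [2.0 * eps]])
    # Equality: q must sum to 1 (q >= 0 comes from the default bounds).
    A_eq = np.concatenate([np.ones(n), np.zeros(n)])[None, :]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0])
    return res.fun

f = np.array([1.0, 2.0, 3.0])
p = np.array([1 / 3, 1 / 3, 1 / 3])
# Adversary moves 0.1 mass from the largest f value to the smallest:
print(worst_case_expectation(f, p, 0.1))  # -> 1.8
```

Each evaluation of the acquisition function in DRBO would require solving such a problem, which is what motivates the fast sensitivity-based approximation the paper proposes.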

Cite

Text

Tay et al. "Efficient Distributionally Robust Bayesian Optimization with Worst-Case Sensitivity." International Conference on Machine Learning, 2022.

Markdown

[Tay et al. "Efficient Distributionally Robust Bayesian Optimization with Worst-Case Sensitivity." International Conference on Machine Learning, 2022.](https://mlanthology.org/icml/2022/tay2022icml-efficient/)

BibTeX

@inproceedings{tay2022icml-efficient,
  title     = {{Efficient Distributionally Robust Bayesian Optimization with Worst-Case Sensitivity}},
  author    = {Tay, Sebastian Shenghong and Foo, Chuan Sheng and Urano, Daisuke and Leong, Richalynn and Low, Bryan Kian Hsiang},
  booktitle = {International Conference on Machine Learning},
  year      = {2022},
  pages     = {21180--21204},
  volume    = {162},
  url       = {https://mlanthology.org/icml/2022/tay2022icml-efficient/}
}