PF$^2$ES: Parallel Feasible Pareto Frontier Entropy Search for Multi-Objective Bayesian Optimization

Abstract

We present Parallel Feasible Pareto Frontier Entropy Search (PF$^2$ES), a novel information-theoretic acquisition function for multi-objective Bayesian optimization supporting unknown constraints and batch queries. Due to the complexity of characterizing the mutual information between candidate evaluations and (feasible) Pareto frontiers, existing approaches must either employ crude approximations that significantly hamper their performance or rely on expensive inference schemes that substantially increase the optimization's computational overhead. By instead using a variational lower bound, PF$^2$ES provides a low-cost and accurate estimate of the mutual information. We benchmark PF$^2$ES against other information-theoretic acquisition functions, demonstrating its competitive performance for optimization across synthetic and real-world design problems.
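The abstract's central idea is replacing an intractable mutual information with a cheap variational lower bound. As a minimal sketch of that general principle (not the paper's PF$^2$ES acquisition itself), the toy example below applies one standard variational lower bound on mutual information, $I(X;Y) \ge H(X) + \mathbb{E}[\log q(X \mid Y)]$ (the Barber–Agakov bound), to a pair of correlated Gaussians, where the true MI is known in closed form; the variational distribution $q$ is a linear-Gaussian fit estimated from samples:

```python
# Toy demonstration of a variational lower bound on mutual information
# (Barber-Agakov bound). NOT the paper's PF^2ES algorithm: here X, Y are
# correlated Gaussians, so the true MI is known and the bound can be checked.
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8        # correlation between X and Y
n = 200_000      # Monte Carlo sample size

# Draw correlated standard Gaussians (X, Y) with correlation rho.
y = rng.standard_normal(n)
x = rho * y + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Analytic MI of a bivariate Gaussian: I(X;Y) = -0.5 * log(1 - rho^2).
true_mi = -0.5 * np.log(1 - rho**2)

# Variational distribution q(x|y): linear-Gaussian, fitted from samples.
a, b = np.polyfit(y, x, 1)
resid = x - (a * y + b)
s2 = resid.var()

# Lower bound: H(X) + E[log q(X|Y)], with H(X) the Gaussian entropy of X.
h_x = 0.5 * np.log(2 * np.pi * np.e * x.var())
e_log_q = np.mean(-0.5 * np.log(2 * np.pi * s2) - resid**2 / (2 * s2))
mi_bound = h_x + e_log_q

print(f"true MI  = {true_mi:.4f}")
print(f"MI bound = {mi_bound:.4f}")
```

Because the fitted linear-Gaussian $q$ is close to the true conditional $p(x \mid y)$ here, the bound is nearly tight; with a misspecified $q$ the estimate drops below the true MI but never above it (up to Monte Carlo noise), which is what makes such bounds safe, low-cost surrogates for the exact mutual information.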

Cite

Text

Qing et al. "PF$^2$ES: Parallel Feasible Pareto Frontier Entropy Search for Multi-Objective Bayesian Optimization." Artificial Intelligence and Statistics, 2023.

Markdown

[Qing et al. "PF$^2$ES: Parallel Feasible Pareto Frontier Entropy Search for Multi-Objective Bayesian Optimization." Artificial Intelligence and Statistics, 2023.](https://mlanthology.org/aistats/2023/qing2023aistats-pf/)

BibTeX

@inproceedings{qing2023aistats-pf,
  title     = {{PF$^2$ES: Parallel Feasible Pareto Frontier Entropy Search for Multi-Objective Bayesian Optimization}},
  author    = {Qing, Jixiang and Moss, Henry B. and Dhaene, Tom and Couckuyt, Ivo},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2023},
  pages     = {2565--2588},
  volume    = {206},
  url       = {https://mlanthology.org/aistats/2023/qing2023aistats-pf/}
}