Pareto Set Learning for Expensive Multi-Objective Optimization

Abstract

Expensive multi-objective optimization problems can be found in many real-world applications, where objective function evaluations involve expensive computations or physical experiments. It is desirable to obtain an approximate Pareto front with a limited evaluation budget. Multi-objective Bayesian optimization (MOBO) has been widely used for finding a finite set of Pareto optimal solutions. However, it is well known that the whole Pareto set lies on a continuous manifold and can contain infinitely many solutions. The structural properties of the Pareto set are not well exploited in existing MOBO methods, and the finite-set approximation may not contain the most preferred solution(s) for decision-makers. This paper develops a novel learning-based method to approximate the whole Pareto set for MOBO, which generalizes the decomposition-based multi-objective optimization algorithm (MOEA/D) from finite populations to models. We design a simple and powerful acquisition search method based on the learned Pareto set, which naturally supports batch evaluation. In addition, with our proposed model, decision-makers can readily explore any trade-off area in the approximate Pareto set for flexible decision-making. This work represents the first attempt to model the Pareto set for expensive multi-objective optimization. Experimental results on different synthetic and real-world problems demonstrate the effectiveness of our proposed method.
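For intuition, the core idea of learning a preference-conditioned model of the Pareto set can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: it uses a small PyTorch MLP (the name ParetoSetModel is hypothetical) trained by minimizing a Tchebycheff scalarization over randomly sampled preference vectors, with a cheap bi-objective toy function standing in for the expensive objectives (in the actual MOBO setting, surrogate-based acquisition values would take their place).

# Illustrative sketch only, not the paper's code.
# Assumption: a ZDT1-like toy problem replaces the expensive objectives.
import torch
import torch.nn as nn

n_dim, n_obj = 5, 2

class ParetoSetModel(nn.Module):  # hypothetical name
    """Maps a preference vector on the simplex to a candidate solution."""
    def __init__(self, n_obj, n_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_obj, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_dim), nn.Sigmoid(),  # box-constrained x in [0, 1]^n
        )

    def forward(self, pref):
        return self.net(pref)

def toy_objectives(x):
    """Cheap stand-in for the expensive objectives (two objectives, minimized)."""
    f1 = x[:, 0]
    g = 1.0 + x[:, 1:].mean(dim=1)
    f2 = g * (1.0 - torch.sqrt(x[:, 0] / g))
    return torch.stack([f1, f2], dim=1)

model = ParetoSetModel(n_obj, n_dim)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
ideal_point = torch.zeros(n_obj)  # assumed (or estimated) ideal objective values

for step in range(2000):
    # Sample random trade-off preferences from the simplex (the decomposition idea).
    pref = torch.distributions.Dirichlet(torch.ones(n_obj)).sample((64,))
    x = model(pref)
    f = toy_objectives(x)
    # Tchebycheff scalarization: each sampled preference defines one subproblem.
    tch = torch.max(pref * (f - ideal_point), dim=1).values
    loss = tch.mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# After training, any trade-off can be queried directly, e.g. a 70/30 preference:
with torch.no_grad():
    x_star = model(torch.tensor([[0.7, 0.3]]))

Because the trained model is a continuous mapping from preferences to solutions, several preferences can be sampled at once to propose a batch of candidates, which is consistent with the batch-evaluation property described in the abstract.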

Cite

Text

Lin et al. "Pareto Set Learning for Expensive Multi-Objective Optimization." Neural Information Processing Systems, 2022.

Markdown

[Lin et al. "Pareto Set Learning for Expensive Multi-Objective Optimization." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/lin2022neurips-pareto/)

BibTeX

@inproceedings{lin2022neurips-pareto,
  title     = {{Pareto Set Learning for Expensive Multi-Objective Optimization}},
  author    = {Lin, Xi and Yang, Zhiyuan and Zhang, Xiaoyuan and Zhang, Qingfu},
  booktitle = {Neural Information Processing Systems},
  year      = {2022},
  url       = {https://mlanthology.org/neurips/2022/lin2022neurips-pareto/}
}