Diffusion Models for Black-Box Optimization

Abstract

The goal of offline black-box optimization (BBO) is to optimize an expensive black-box function using a fixed dataset of function evaluations. Prior works consider forward approaches that learn surrogates to the black-box function and inverse approaches that directly map function values to corresponding points in the input domain of the black-box function. These approaches are limited by the quality of the offline dataset and the difficulty in learning one-to-many mappings in high dimensions, respectively. We propose Denoising Diffusion Optimization Models (DDOM), a new inverse approach for offline black-box optimization based on diffusion models. Given an offline dataset, DDOM learns a conditional generative model over the domain of the black-box function conditioned on the function values. We investigate several design choices in DDOM, such as reweighting the dataset to focus on high function values and the use of classifier-free guidance at test-time to enable generalization to function values that can even exceed the dataset maxima. Empirically, we conduct experiments on the Design-Bench benchmark (Trabucco et al., 2022) and show that DDOM achieves results competitive with state-of-the-art baselines.
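The two design choices highlighted above — reweighting the offline dataset toward high function values, and classifier-free guidance when sampling conditioned on a target value — can be illustrated with a minimal sketch. This is not the paper's implementation: the softmax-style reweighting, the `toy_denoiser` stand-in, and the simplified update rule (no proper DDPM noise schedule) are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def reweight(ys, temperature=1.0):
    """Softmax-style weights that emphasize high function values.
    (Illustrative only; the paper's exact reweighting scheme may differ.)"""
    w = np.exp((ys - ys.max()) / temperature)
    return w / w.sum()

def guided_eps(eps_cond, eps_uncond, guidance_w):
    """Classifier-free guidance: extrapolate the conditional prediction
    away from the unconditional one; guidance_w = 0 recovers the
    purely conditional model."""
    return (1.0 + guidance_w) * eps_cond - guidance_w * eps_uncond

def toy_denoiser(x, y_cond=None):
    """Placeholder for a trained conditional noise-prediction network;
    y_cond=None plays the role of the null (unconditional) token."""
    return 0.1 * x if y_cond is None else 0.1 * x - 0.01 * y_cond

def sample(y_target, dim=2, steps=10, guidance=2.0):
    """Simplified ancestral-style sampling loop, conditioning on a target
    function value y_target (noise schedule and variance terms omitted)."""
    x = rng.standard_normal(dim)
    for _ in range(steps):
        eps_c = toy_denoiser(x, y_target)
        eps_u = toy_denoiser(x, None)
        eps = guided_eps(eps_c, eps_u, guidance)
        x = x - 0.1 * eps  # simplified update; a real DDPM scales by the schedule
    return x
```

Setting `y_target` above the best value in the offline dataset is how an inverse model of this kind can attempt to extrapolate beyond the dataset maximum; the guidance weight trades off sample diversity against fidelity to the conditioning value.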

Cite

Text

Krishnamoorthy et al. "Diffusion Models for Black-Box Optimization." International Conference on Machine Learning, 2023.

Markdown

[Krishnamoorthy et al. "Diffusion Models for Black-Box Optimization." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/krishnamoorthy2023icml-diffusion/)

BibTeX

@inproceedings{krishnamoorthy2023icml-diffusion,
  title     = {{Diffusion Models for Black-Box Optimization}},
  author    = {Krishnamoorthy, Siddarth and Mashkaria, Satvik Mehul and Grover, Aditya},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {17842--17857},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/krishnamoorthy2023icml-diffusion/}
}