Offline Model-Based Optimization via Normalized Maximum Likelihood Estimation

Abstract

In this work we consider data-driven optimization problems where one must maximize a function given only queries at a fixed set of points. This problem setting emerges in many domains where function evaluation is a complex and expensive process, such as in the design of materials, vehicles, or neural network architectures. Because the available data typically covers only a small manifold of the possible input space, a principal challenge is constructing algorithms that can reason about uncertainty and out-of-distribution values, since a naive optimizer can easily exploit an estimated model to return adversarial inputs. We propose to tackle this model-based optimization (MBO) problem by leveraging the normalized maximum likelihood (NML) estimator, which provides a principled approach to handling uncertainty and out-of-distribution inputs. While NML is intractable in its standard formulation, we propose a tractable approximation that allows us to scale our method to high-capacity neural network models. We demonstrate that our method can effectively optimize high-dimensional design problems in a variety of disciplines such as chemistry, biology, and materials engineering.
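For context, the conditional NML (CNML) distribution that this approach builds on is standardly defined as follows; the notation below is a sketch based on the standard CNML formulation, not quoted from the paper. Given an offline dataset $\mathcal{D}$ and a query input $x$:

$$
p_{\mathrm{NML}}(y \mid x) \;=\; \frac{p_{\theta_y}(y \mid x)}{\int p_{\theta_{y'}}(y' \mid x)\, dy'},
\qquad
\theta_y \;=\; \arg\max_{\theta}\; \log p_\theta\big(\mathcal{D} \cup \{(x, y)\}\big),
$$

i.e., each candidate label $y$ is scored by a model retrained with $(x, y)$ appended to the dataset, and the scores are normalized over all candidate labels. The normalizing integral (and the per-label retraining it implies) is what makes exact NML intractable for neural networks, motivating the tractable approximation the abstract describes.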

Cite

Text

Fu and Levine. "Offline Model-Based Optimization via Normalized Maximum Likelihood Estimation." International Conference on Learning Representations, 2021.

Markdown

[Fu and Levine. "Offline Model-Based Optimization via Normalized Maximum Likelihood Estimation." International Conference on Learning Representations, 2021.](https://mlanthology.org/iclr/2021/fu2021iclr-offline/)

BibTeX

@inproceedings{fu2021iclr-offline,
  title     = {{Offline Model-Based Optimization via Normalized Maximum Likelihood Estimation}},
  author    = {Fu, Justin and Levine, Sergey},
  booktitle = {International Conference on Learning Representations},
  year      = {2021},
  url       = {https://mlanthology.org/iclr/2021/fu2021iclr-offline/}
}