Neural Nonmyopic Bayesian Optimization in Dynamic Cost Settings
Abstract
Bayesian optimization (BO) is a popular framework for optimizing black-box functions, leveraging probabilistic models such as Gaussian processes. Conventional BO algorithms, however, assume static query costs, which limits their applicability to real-world problems with dynamic cost structures, such as geological surveys or biological sequence design, where query costs vary with previous actions. To address this, we propose LookaHES, a novel nonmyopic BO algorithm featuring dynamic cost models. LookaHES employs a neural network policy for variational optimization over multi-step lookahead horizons, enabling planning in dynamic cost environments. Empirically, we benchmark LookaHES on synthetic functions exhibiting varied dynamic cost structures. We then apply LookaHES to a real-world protein sequence design task using a large language model policy, demonstrating its scalability and effectiveness in multi-step planning over a large and complex query space. LookaHES consistently outperforms its myopic counterparts in both synthetic and real-world settings, significantly improving efficiency and solution quality. Our implementation is available at https://github.com/sangttruong/nonmyopia.
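To make the abstract's core idea concrete, the toy sketch below illustrates what nonmyopic planning under a dynamic, action-dependent query cost can look like: a tiny parametric policy is rolled out over a multi-step horizon, and its parameters are chosen to maximize the cumulative objective value minus movement costs. This is a minimal illustration only, not the paper's LookaHES implementation; all names (`objective`, `movement_cost`, `rollout_value`) and the random-search optimizer (a stand-in for gradient-based variational optimization) are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    # Black-box function to maximize (known here only for illustration).
    return np.sin(3 * x) + 0.5 * x

def movement_cost(x_prev, x_next):
    # Dynamic cost: querying far from the previous query is expensive,
    # so the cost of an action depends on the previous action.
    return abs(x_next - x_prev)

def rollout_value(policy_params, x0, horizon=3, cost_weight=0.1):
    # Roll a one-parameter "neural" policy forward `horizon` steps;
    # each step's reward is the objective value minus the dynamic cost.
    W, b = policy_params
    x, total = x0, 0.0
    for _ in range(horizon):
        x_next = np.tanh(W * x + b)  # policy proposes the next query
        total += objective(x_next) - cost_weight * movement_cost(x, x_next)
        x = x_next
    return total

# Choose policy parameters by random search over rollouts (a crude
# stand-in for the variational optimization used in practice).
best_params, best_val = None, -np.inf
for _ in range(500):
    params = (rng.normal(), rng.normal())
    val = rollout_value(params, x0=0.0)
    if val > best_val:
        best_params, best_val = params, val

print(f"best multi-step value: {best_val:.3f}")
```

Because the rollout scores whole query trajectories rather than single queries, the selected policy can accept a locally suboptimal step if it sets up cheaper, higher-value queries later, which is the essence of nonmyopic planning.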
Cite
Truong et al. "Neural Nonmyopic Bayesian Optimization in Dynamic Cost Settings." ICLR 2025 Workshops: AgenticAI, 2025.

BibTeX:
@inproceedings{truong2025iclrw-neural,
  title     = {{Neural Nonmyopic Bayesian Optimization in Dynamic Cost Settings}},
  author    = {Truong, Sang T. and Nguyen, Duc Quang and Neiswanger, Willie and Griffiths, Ryan-Rhys and Ermon, Stefano and Haber, Nick and Koyejo, Sanmi},
  booktitle = {ICLR 2025 Workshops: AgenticAI},
  year      = {2025},
  url       = {https://mlanthology.org/iclrw/2025/truong2025iclrw-neural/}
}