BayeSQP: Bayesian Optimization Through Sequential Quadratic Programming
Abstract
We introduce BayeSQP, a novel algorithm for general black-box optimization that merges the structure of sequential quadratic programming with concepts from Bayesian optimization. BayeSQP employs second-order Gaussian process surrogates for both the objective and constraints to jointly model function values, gradients, and Hessians from zero-order information alone. At each iteration, a local subproblem is constructed from the GP posterior estimates and solved to obtain a search direction. Crucially, the subproblem formulation explicitly incorporates uncertainty in both the function and derivative estimates, yielding a tractable second-order cone program for high-probability improvement under model uncertainty. A subsequent one-dimensional line search via constrained Thompson sampling selects the next evaluation point. Empirical results show that BayeSQP outperforms state-of-the-art methods in specific high-dimensional settings. Our algorithm offers a principled and flexible framework that bridges classical optimization techniques with modern approaches to black-box optimization.
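The loop described in the abstract can be illustrated with a minimal, heavily simplified sketch. This is not the authors' implementation: the second-order GP surrogate is reduced to a plain zero-order GP, the second-order cone subproblem is replaced by a gradient step on the GP posterior mean (finite-differenced as a stand-in for the surrogate's derivative information), and the constrained Thompson-sampling line search becomes an unconstrained Thompson sample along the search direction. All function names and parameters here are assumptions for illustration.

```python
import numpy as np

# Hypothetical BayeSQP-style loop on a toy 2-D problem. Simplifications
# (assumptions, not the paper's method): no constraints, gradient step
# instead of the SOCP subproblem, plain GP instead of a second-order GP.

def rbf(A, B, ls=0.5):
    # Squared-exponential kernel between row-wise point sets A and B.
    d = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d / ls**2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # Standard GP posterior mean and covariance at test points Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks, Kss = rbf(X, Xs), rbf(Xs, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    v = np.linalg.solve(L, Ks)
    return Ks.T @ alpha, Kss - v.T @ v

def objective(x):
    # Toy black-box objective with minimum 0 at (0.3, 0.3).
    return np.sum((x - 0.3) ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (5, 2))
y = np.array([objective(x) for x in X])

for _ in range(15):
    x_best = X[np.argmin(y)]
    # Finite-difference gradient of the posterior mean (stand-in for the
    # derivative estimates a second-order GP surrogate would provide).
    eps, grad = 1e-4, np.zeros(2)
    for i in range(2):
        e = np.zeros(2)
        e[i] = eps
        mp, _ = gp_posterior(X, y, (x_best + e)[None, :])
        mm, _ = gp_posterior(X, y, (x_best - e)[None, :])
        grad[i] = (mp[0] - mm[0]) / (2 * eps)
    direction = -grad / (np.linalg.norm(grad) + 1e-12)
    # One-dimensional line search: Thompson-sample the GP on candidate
    # step sizes along the search direction, evaluate the best sample.
    ts = np.linspace(0.01, 0.5, 20)
    cand = x_best[None, :] + ts[:, None] * direction[None, :]
    mu, cov = gp_posterior(X, y, cand)
    sample = rng.multivariate_normal(mu, cov + 1e-6 * np.eye(len(ts)))
    x_next = cand[np.argmin(sample)]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next))

print("best objective found:", y.min())
```

The sketch keeps the two-phase structure of the abstract (local subproblem for a search direction, then a 1-D stochastic line search) while omitting the uncertainty-aware SOCP that is the paper's actual contribution.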
Cite
Text
Brunzema and Trimpe. "BayeSQP: Bayesian Optimization Through Sequential Quadratic Programming." Advances in Neural Information Processing Systems, 2025.

Markdown
[Brunzema and Trimpe. "BayeSQP: Bayesian Optimization Through Sequential Quadratic Programming." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/brunzema2025neurips-bayesqp/)

BibTeX
@inproceedings{brunzema2025neurips-bayesqp,
title = {{BayeSQP: Bayesian Optimization Through Sequential Quadratic Programming}},
author = {Brunzema, Paul and Trimpe, Sebastian},
booktitle = {Advances in Neural Information Processing Systems},
year = {2025},
url = {https://mlanthology.org/neurips/2025/brunzema2025neurips-bayesqp/}
}