Convergence Rates of Constrained Expected Improvement

Abstract

Constrained Bayesian optimization (CBO) methods have seen significant success in black-box optimization with constraints. One of the most commonly used CBO methods is the constrained expected improvement (CEI) algorithm. CEI is a natural extension of expected improvement (EI) when constraints are incorporated. However, the theoretical convergence rate of CEI has not been established. In this work, we study the convergence rate of CEI by analyzing its simple regret upper bound. First, we show that when the objective function $f$ and the constraint function $c$ are assumed to each lie in a reproducing kernel Hilbert space (RKHS), CEI achieves convergence rates of $\mathcal{O}\left(t^{-\frac{1}{2}}\log^{\frac{d+1}{2}}(t)\right)$ and $\mathcal{O}\left(t^{-\frac{\nu}{2\nu+d}} \log^{\frac{\nu}{2\nu+d}}(t)\right)$ for the commonly used squared exponential and Matérn kernels, respectively. Second, we show that when $f$ is assumed to be sampled from Gaussian processes (GPs), CEI achieves similar convergence rates with high probability. Numerical experiments are performed to validate the theoretical analysis.
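For context, a standard formulation of the CEI acquisition function in the constrained BO literature (a sketch for minimization under a feasibility constraint $c(x) \le 0$; the paper's exact definition may differ) multiplies the expected improvement over the best feasible observation by the posterior probability of feasibility:

$$\mathrm{CEI}_t(x) = \mathbb{E}_t\!\left[\max\!\left(f_t^{+} - f(x),\, 0\right)\right] \cdot \Pr_t\!\left(c(x) \le 0\right),$$

where $f_t^{+}$ denotes the best feasible objective value observed up to iteration $t$, and the expectation and probability are taken under the GP posteriors of $f$ and $c$, respectively.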

Cite

Text

Wang et al. "Convergence Rates of Constrained Expected Improvement." Advances in Neural Information Processing Systems, 2025.

Markdown

[Wang et al. "Convergence Rates of Constrained Expected Improvement." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/wang2025neurips-convergence/)

BibTeX

@inproceedings{wang2025neurips-convergence,
  title     = {{Convergence Rates of Constrained Expected Improvement}},
  author    = {Wang, Haowei and Wang, Jingyi and Dai, Zhongxiang and Chiang, Nai-Yuan and Ng, Szu Hui and Petra, Cosmin G.},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/wang2025neurips-convergence/}
}