CoffeeBoost: Gradient Boosting Native Conformal Inference for Bayesian Optimization

Abstract

Bayesian optimization (BO) is a key technique for solving black-box optimization problems. This study extends the scope of BO from conventional applications (e.g., AutoML and robot learning) to the automated tuning of software systems. Although Gaussian processes (GPs) provide the foundational formalism for exploitation and exploration in BO, their limited predictive power and unrealistic assumptions (e.g., continuity and Gaussianity) can severely degrade their effectiveness and efficiency when tuning complex software systems. To overcome these limitations, we propose CoffeeBoost, a BO framework that implements exploitation and exploration with a GBDT-native, distribution-free probabilistic surrogate model. CoffeeBoost constructs surrogate models via stochastic gradient boosting ensembles (SGBE) and quantifies predictive distributions via distribution-free conformal predictive systems. Moreover, CoffeeBoost leverages the residual paths in the SGBE to improve the local adaptiveness of the resulting predictive distributions in a GBDT-native manner. Across eight auto-tuning benchmarks for database management systems (DBMS), we evaluate CoffeeBoost and show that it surpasses existing GP-based and tree-ensemble-based BO schemes in both learnability and optimizability. Detailed analysis further shows that CoffeeBoost's predictive distributions excel in both coverage and tightness.
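
For intuition only, the snippet below is a minimal sketch of the general idea the abstract describes: a stochastic gradient boosting surrogate whose uncertainty comes from a distribution-free split-conformal calibration step, plugged into a simple BO loop with a lower-confidence-bound acquisition. It is not the authors' implementation. Scikit-learn's GradientBoostingRegressor, the helper names (fit_conformal_gbdt, lower_confidence_bound), and the toy objective are illustrative assumptions, and the single global conformal width used here does not capture CoffeeBoost's residual-path-based, locally adaptive conformal predictive system.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def fit_conformal_gbdt(X, y, alpha=0.1, rng=None):
    # Fit a stochastic-gradient-boosting surrogate on a training split and
    # calibrate a split-conformal residual quantile on a held-out split.
    if rng is None:
        rng = np.random.default_rng(0)
    idx = rng.permutation(len(X))
    cut = max(1, int(0.8 * len(X)))
    train, cal = idx[:cut], idx[cut:]
    model = GradientBoostingRegressor(n_estimators=200, max_depth=3, subsample=0.8)
    model.fit(X[train], y[train])
    scores = np.abs(y[cal] - model.predict(X[cal]))        # nonconformity scores
    width = np.quantile(scores, 1 - alpha) if len(scores) else np.inf
    return model, width

def lower_confidence_bound(model, width, X_cand):
    # Distribution-free LCB acquisition for minimization:
    # the optimistic end of the conformal interval [mu - width, mu + width].
    return model.predict(X_cand) - width

# Toy usage: minimize a 1-D black-box function over [-3, 3].
def black_box(x):
    return np.sin(3.0 * x) + 0.1 * x ** 2

rng = np.random.default_rng(42)
X = rng.uniform(-3.0, 3.0, size=(10, 1))
y = black_box(X[:, 0])
X_cand = np.linspace(-3.0, 3.0, 400).reshape(-1, 1)
for _ in range(20):
    model, width = fit_conformal_gbdt(X, y, alpha=0.1, rng=rng)
    x_next = X_cand[np.argmin(lower_confidence_bound(model, width, X_cand))]
    X = np.vstack([X, x_next[None, :]])
    y = np.append(y, black_box(x_next[0]))
print("best x:", float(X[np.argmin(y), 0]), "best y:", float(y.min()))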

Cite

Text

Lai et al. "CoffeeBoost: Gradient Boosting Native Conformal Inference for Bayesian Optimization." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I17.33982

Markdown

[Lai et al. "CoffeeBoost: Gradient Boosting Native Conformal Inference for Bayesian Optimization." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/lai2025aaai-coffeeboost/) doi:10.1609/AAAI.V39I17.33982

BibTeX

@inproceedings{lai2025aaai-coffeeboost,
  title     = {{CoffeeBoost: Gradient Boosting Native Conformal Inference for Bayesian Optimization}},
  author    = {Lai, Yuanhao and Zheng, Pengfei and Ji, Chenpeng and Qiu, Cheng and Wang, Tingkai and Zhang, Songhan and Wang, Zhengang and Du, Yunfei},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {18017--18025},
  doi       = {10.1609/AAAI.V39I17.33982},
  url       = {https://mlanthology.org/aaai/2025/lai2025aaai-coffeeboost/}
}