Scalable Acceleration for Classification-Based Derivative-Free Optimization
Abstract
Derivative-free optimization algorithms play an important role in scientific and engineering design optimization problems, especially when derivative information is not accessible. In this paper, we study the framework of sequential classification-based derivative-free optimization algorithms. By introducing the learning-theoretic concept of hypothesis-target shattering rate, we revisit the computational complexity upper bound of SRACOS. Inspired by the revisited upper bound, we propose an algorithm named RACE-CARS, which adds a random region-shrinking step to SRACOS. We further establish theorems showing the acceleration achieved by region shrinking. Experiments on synthetic functions as well as black-box tuning for language-model-as-a-service empirically demonstrate the efficiency of RACE-CARS. An ablation experiment on the introduced hyperparameters is also conducted, revealing the mechanism of RACE-CARS and offering empirical guidance for hyperparameter tuning.
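To illustrate the idea of adding a random region-shrinking step to a sequential sampling loop, here is a minimal, hypothetical Python sketch. It is not the authors' implementation: the function name `race_cars_sketch` and the parameters `shrink_prob` and `shrink_rate` are assumptions, and the classification-based sampling of SRACOS is replaced by a uniform stand-in.

```python
import random

def race_cars_sketch(f, bounds, budget, shrink_prob=0.5, shrink_rate=0.9):
    """Hypothetical sketch of a sequential derivative-free optimization loop
    with a random region-shrinking step (names and defaults are assumptions).
    `f` is the black-box objective; `bounds` is a list of (lo, hi) pairs."""
    dim = len(bounds)
    region = [list(b) for b in bounds]  # current search region, shrunk over time

    def sample(region):
        return [random.uniform(lo, hi) for lo, hi in region]

    best_x = sample(region)
    best_y = f(best_x)
    for _ in range(budget - 1):
        # Sequential sampling: draw a candidate from the current region.
        # (SRACOS samples from a learned region classified around good
        # solutions; uniform sampling here is only a stand-in.)
        x = sample(region)
        y = f(x)
        if y < best_y:
            best_x, best_y = x, y
        # Random region shrinking: with probability shrink_prob, contract
        # one randomly chosen coordinate of the region around the best point.
        if random.random() < shrink_prob:
            i = random.randrange(dim)
            lo, hi = region[i]
            half = (hi - lo) * shrink_rate / 2
            center = best_x[i]
            region[i] = [max(lo, center - half), min(hi, center + half)]
    return best_x, best_y

# Usage example on a 10-dimensional sphere function:
best_x, best_y = race_cars_sketch(lambda x: sum(v * v for v in x),
                                  bounds=[(-5, 5)] * 10, budget=1000)
```

Under this reading, shrinking restricts sampling to an ever-smaller neighborhood of the incumbent, which is the mechanism the paper's theorems credit for the acceleration over SRACOS.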
Cite
Text
Han et al. "Scalable Acceleration for Classification-Based Derivative-Free Optimization." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I16.33874
Markdown
[Han et al. "Scalable Acceleration for Classification-Based Derivative-Free Optimization." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/han2025aaai-scalable/) doi:10.1609/AAAI.V39I16.33874
BibTeX
@inproceedings{han2025aaai-scalable,
title = {{Scalable Acceleration for Classification-Based Derivative-Free Optimization}},
author = {Han, Tianyi and Li, Jingya and Guo, Zhipeng and Jin, Yuan},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2025},
  pages = {17050--17058},
doi = {10.1609/AAAI.V39I16.33874},
url = {https://mlanthology.org/aaai/2025/han2025aaai-scalable/}
}