SFS: Smarter Code Space Search Improves LLM Inference Scaling
Abstract
We frame code generation as a black-box optimization problem within the code space and demonstrate how optimization-inspired techniques can improve inference scaling compared to searching directly over text. Based on this perspective, we propose **SCATTERED FOREST SEARCH (SFS)**, a novel approach that increases solution diversity during evolutionary search, thereby helping to avoid local optima. Our theoretical analysis illustrates how these methods improve exploration and efficiency. Extensive experiments on *HumanEval, MBPP, APPS, CodeContests,* and *Leetcode* reveal significant performance gains. For instance, our method achieves a **pass@1 rate of 67.1% on HumanEval+** and **87.2% on HumanEval with GPT-3.5**, improvements of **8.6%** and **4.3%** over the state of the art, while also halving the number of iterations needed to find the correct solution. Furthermore, our approach scales more efficiently than existing search techniques, including **tree search, line search,** and **repeated sampling (Best of N)**.
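The abstract's core idea, treating candidate programs as points in a black-box search space and evolving a *diverse* population instead of repeatedly sampling independent drafts, can be illustrated with a toy sketch. The code below is not the paper's SFS algorithm or API; it is a minimal, self-contained stand-in where bit vectors play the role of candidate programs, a hidden target plays the role of a test suite, and all names (`black_box_score`, `mutate`, `evolutionary_search`) are illustrative assumptions.

```python
import random

def black_box_score(candidate):
    # Toy stand-in for "run the tests": count positions matching
    # a hidden target; a real system would execute unit tests.
    target = [1, 0, 1, 1, 0, 0, 1, 0]
    return sum(int(a == b) for a, b in zip(candidate, target))

def mutate(candidate, rng):
    # Local "revision": flip one position, loosely analogous to an
    # LLM refining one part of a draft solution from feedback.
    i = rng.randrange(len(candidate))
    child = list(candidate)
    child[i] ^= 1
    return child

def evolutionary_search(pop_size=4, iters=40, length=8, seed=0):
    rng = random.Random(seed)
    # Scattered seeding: start from several diverse random drafts
    # rather than one, so the search covers more of the space.
    population = [[rng.randint(0, 1) for _ in range(length)]
                  for _ in range(pop_size)]
    best = max(population, key=black_box_score)
    for _ in range(iters):
        # Branch from any surviving candidate, not only the current
        # best, which preserves diversity and avoids local optima.
        parent = rng.choice(population)
        child = mutate(parent, rng)
        if black_box_score(child) >= black_box_score(parent):
            population[population.index(parent)] = child
        best = max(population + [best], key=black_box_score)
    return best, black_box_score(best)

best, score = evolutionary_search()
print(best, score)
```

Contrast this with repeated sampling (Best of N), which would draw `pop_size * iters` independent random candidates and keep the best; the evolutionary variant reuses information from earlier evaluations, which is the efficiency gain the abstract attributes to searching the code space rather than resampling from scratch.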
Cite
Text
Light et al. "SFS: Smarter Code Space Search Improves LLM Inference Scaling." International Conference on Learning Representations, 2025.
Markdown
[Light et al. "SFS: Smarter Code Space Search Improves LLM Inference Scaling." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/light2025iclr-sfs/)
BibTeX
@inproceedings{light2025iclr-sfs,
  title     = {{SFS: Smarter Code Space Search Improves LLM Inference Scaling}},
  author    = {Light, Jonathan and Wu, Yue and Sun, Yiyou and Yu, Wenchao and Liu, Yanchi and Zhao, Xujiang and Hu, Ziniu and Chen, Haifeng and Cheng, Wei},
  booktitle = {International Conference on Learning Representations},
  year      = {2025},
  url       = {https://mlanthology.org/iclr/2025/light2025iclr-sfs/}
}