Function-to-Style Guidance of LLMs for Code Translation

Abstract

Large language models (LLMs) have made significant strides in code translation tasks. However, ensuring both the correctness and readability of translated code remains a challenge, limiting their effective adoption in real-world software development. In this work, we propose F2STrans, a function-to-style guiding paradigm designed to progressively improve the performance of LLMs in code translation. Our approach comprises two key stages: (1) functional learning, which optimizes translation correctness using high-quality source-target code pairs mined from online programming platforms, and (2) style learning, which improves translation readability by incorporating both positive and negative style examples. Additionally, we introduce a novel code translation benchmark that includes up-to-date source code, extensive test cases, and manually annotated ground-truth translations, enabling comprehensive functional and stylistic evaluations. Experiments on both our new benchmark and existing datasets demonstrate that our approach significantly improves code translation performance. Notably, our approach enables Qwen-1.5B to outperform prompt-enhanced Qwen-32B and GPT-4 on average across 20 diverse code translation scenarios.

Cite

Text

Zhang et al. "Function-to-Style Guidance of LLMs for Code Translation." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Zhang et al. "Function-to-Style Guidance of LLMs for Code Translation." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/zhang2025icml-functiontostyle/)

BibTeX

@inproceedings{zhang2025icml-functiontostyle,
  title     = {{Function-to-Style Guidance of LLMs for Code Translation}},
  author    = {Zhang, Longhui and Wang, Bin and Wang, Jiahao and Zhao, Xiaofeng and Zhang, Min and Yang, Hao and Zhang, Meishan and Li, Yu and Li, Jing and Yu, Jun and Zhang, Min},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {76273--76288},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/zhang2025icml-functiontostyle/}
}