AST-T5: Structure-Aware Pretraining for Code Generation and Understanding
Abstract
Large language models (LLMs) have made significant advancements in code-related tasks, yet many LLMs treat code as simple sequences, neglecting its structured nature. We introduce AST-T5, a novel pretraining paradigm that leverages the Abstract Syntax Tree (AST) for enhanced code generation, transpilation, and understanding. Using dynamic programming, our AST-Aware Segmentation retains code structure, while our AST-Aware Span Corruption objective equips the model to reconstruct various code structures. Unlike other models, AST-T5 avoids complex program analyses or architectural changes, so it integrates seamlessly with any encoder-decoder Transformer. Evaluations show that AST-T5 consistently outperforms similar-sized LMs across various code-related tasks including HumanEval and MBPP. Structure-awareness makes AST-T5 particularly powerful in code-to-code tasks, surpassing CodeT5 by 2 points in exact match score for the Bugs2Fix task and by 3 points in exact match score for Java-C# Transpilation in CodeXGLUE. Our code and model are publicly available at https://github.com/gonglinyuan/ast_t5.
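To make the AST-Aware Span Corruption idea concrete, below is a minimal, hypothetical sketch in Python. It uses the built-in ast module as the parser and masks a single statement-level subtree with a T5-style sentinel token. The helper names (subtree_spans, corrupt_subtree) and the single-span setup are illustrative assumptions only, not AST-T5's actual implementation, which handles multiple programming languages and pairs this objective with AST-Aware Segmentation.

import ast

def subtree_spans(source: str):
    """Collect (start, end) character offsets of statement-level AST subtrees."""
    tree = ast.parse(source)
    lines = source.splitlines(keepends=True)
    # Character offset at which each line starts.
    offsets = [0]
    for line in lines:
        offsets.append(offsets[-1] + len(line))
    spans = []
    for node in ast.walk(tree):
        if isinstance(node, ast.stmt) and node.end_lineno is not None:
            start = offsets[node.lineno - 1] + node.col_offset
            end = offsets[node.end_lineno - 1] + node.end_col_offset
            spans.append((start, end))
    return spans

def corrupt_subtree(source: str, span):
    """Mask one AST subtree with a sentinel, T5-style: return (input, target)."""
    start, end = span
    corrupted = source[:start] + "<extra_id_0>" + source[end:]
    target = "<extra_id_0> " + source[start:end]
    return corrupted, target

code = "def add(a, b):\n    total = a + b\n    return total\n"
spans = subtree_spans(code)
inp, tgt = corrupt_subtree(code, spans[-1])  # here spans[-1] is the `return` statement
print(inp)
print(tgt)

Printing inp and tgt shows the masked statement moved into the target sequence, i.e., the corrupted span is aligned with an AST subtree rather than a random token window, which is the reconstruction task the abstract describes.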
Cite
Text
Gong et al. "AST-T5: Structure-Aware Pretraining for Code Generation and Understanding." International Conference on Machine Learning, 2024.
Markdown
[Gong et al. "AST-T5: Structure-Aware Pretraining for Code Generation and Understanding." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/gong2024icml-astt5/)
BibTeX
@inproceedings{gong2024icml-astt5,
  title = {{AST-T5: Structure-Aware Pretraining for Code Generation and Understanding}},
  author = {Gong, Linyuan and Elhoushi, Mostafa and Cheung, Alvin},
  booktitle = {International Conference on Machine Learning},
  year = {2024},
  pages = {15839--15853},
  volume = {235},
  url = {https://mlanthology.org/icml/2024/gong2024icml-astt5/}
}