Code Representation Learning Using Prüfer Sequences (Student Abstract)
Abstract
An effective and efficient encoding of the source code of a computer program is critical to the success of sequence-to-sequence deep neural network models for code representation learning. In this study, we propose to use the Prüfer sequence of the Abstract Syntax Tree (AST) of a computer program to design a sequential representation scheme that preserves the structural information in an AST. Our representation makes it possible to develop deep-learning models in which signals carried by lexical tokens in the training examples can be exploited automatically and selectively based on their syntactic role and importance. Unlike other recently proposed approaches, our representation is concise and lossless in terms of the structural information of the AST. Results from our experiments show that the Prüfer-sequence-based representation is indeed highly effective and efficient.
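For context, the classic Prüfer encoding that the abstract builds on maps a labeled tree on n nodes to a sequence of n − 2 node labels by repeatedly deleting the smallest-labeled leaf and recording its neighbor. The sketch below illustrates this standard construction; the function name and edge-list input format are illustrative assumptions, not the authors' implementation.

```python
from collections import defaultdict

def prufer_sequence(edges, n):
    """Return the Prüfer sequence of a tree on nodes 0..n-1,
    given as a list of undirected edges (u, v).

    Illustrative sketch of the standard encoding, not the paper's code.
    """
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seq = []
    for _ in range(n - 2):
        # Delete the smallest-labeled leaf and record its unique neighbor.
        leaf = min(u for u in range(n) if len(adj[u]) == 1)
        neighbor = next(iter(adj[leaf]))
        seq.append(neighbor)
        adj[neighbor].remove(leaf)
        del adj[leaf]
    return seq
```

Because the map between labeled trees and Prüfer sequences is a bijection, the encoding is lossless: the original tree (here, an AST skeleton) can be fully reconstructed from the sequence, which is the property the paper exploits.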
Cite
Text
Jinpa and Gao. "Code Representation Learning Using Prüfer Sequences (Student Abstract)." AAAI Conference on Artificial Intelligence, 2022. doi:10.1609/AAAI.V36I11.21625
Markdown
[Jinpa and Gao. "Code Representation Learning Using Prüfer Sequences (Student Abstract)." AAAI Conference on Artificial Intelligence, 2022.](https://mlanthology.org/aaai/2022/jinpa2022aaai-code/) doi:10.1609/AAAI.V36I11.21625
BibTeX
@inproceedings{jinpa2022aaai-code,
title = {{Code Representation Learning Using Prüfer Sequences (Student Abstract)}},
author = {Jinpa, Tenzin and Gao, Yong},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2022},
pages = {12977-12978},
doi = {10.1609/AAAI.V36I11.21625},
url = {https://mlanthology.org/aaai/2022/jinpa2022aaai-code/}
}