Code Execution with Pre-trained Language Models

Chenxiao Liu, Shuai Lu, Weizhu Chen, Daxin Jiang, Alexey Svyatkovskiy, Shengyu Fu, Neel Sundaresan, Nan Duan


Abstract
Code execution is a fundamental aspect of programming language semantics that reflects the exact behavior of the code. However, most pre-trained models for code intelligence ignore the execution trace and rely only on source code and syntactic structures. In this paper, we investigate how well pre-trained models can understand and perform code execution. We develop a mutation-based data augmentation technique to create a large-scale, realistic Python dataset and task for code execution, which challenges existing models such as Codex. We then present CodeExecutor, a Transformer model that leverages code execution pre-training and curriculum learning to enhance its semantic comprehension. We evaluate CodeExecutor on code execution, showing both its promising performance and its current limitations. We also demonstrate its potential benefits for code intelligence tasks such as zero-shot code-to-code search and text-to-code generation. Our analysis provides insights into the learning and generalization abilities of pre-trained models for code execution.
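As a concrete illustration of the two ingredients the abstract mentions, the sketch below mutates a small Python program and records its line-level execution trace. This is a minimal, hypothetical example: the helper names (trace_execution, mutate), the trace format, and the mutation operators are our own illustrative choices built on Python's standard sys.settrace hook, not the paper's actual data pipeline.

    import random
    import sys


    def trace_execution(source: str) -> list:
        """Run a Python snippet and record a line-level execution trace.

        Each entry is (line number, snapshot of variables when execution
        reaches that line). Illustrative stand-in for the execution traces
        the paper targets, not the authors' exact trace format.
        """
        trace = []

        def tracer(frame, event, arg):
            # Only record "line" events from the snippet itself.
            if event == "line" and frame.f_code.co_filename == "<snippet>":
                state = {k: v for k, v in frame.f_locals.items()
                         if not k.startswith("__")}
                trace.append((frame.f_lineno, state))
            return tracer

        code = compile(source, "<snippet>", "exec")
        sys.settrace(tracer)
        try:
            exec(code, {})
        finally:
            sys.settrace(None)  # always detach the tracer
        return trace


    def mutate(source: str) -> str:
        """Apply one simple operator substitution to produce a program
        variant -- a toy version of mutation-based data augmentation."""
        candidates = [(old, new) for old, new in
                      [("+", "-"), ("<", "<="), ("*", "+")] if old in source]
        if not candidates:
            return source
        old, new = random.choice(candidates)
        return source.replace(old, new, 1)


    snippet = "x = 3\ny = x + 4\nwhile x < y:\n    x = x * 2\n"
    for program in (snippet, mutate(snippet)):
        print(trace_execution(program))

Running the sketch prints one (line number, variable snapshot) pair per executed line for the original program and for one mutated variant; such (program, trace) pairs are loosely the kind of input/output the code execution task pairs together.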
Anthology ID: 2023.findings-acl.308
Volume: Findings of the Association for Computational Linguistics: ACL 2023
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 4984–4999
URL: https://aclanthology.org/2023.findings-acl.308
DOI: 10.18653/v1/2023.findings-acl.308
Cite (ACL): Chenxiao Liu, Shuai Lu, Weizhu Chen, Daxin Jiang, Alexey Svyatkovskiy, Shengyu Fu, Neel Sundaresan, and Nan Duan. 2023. Code Execution with Pre-trained Language Models. In Findings of the Association for Computational Linguistics: ACL 2023, pages 4984–4999, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): Code Execution with Pre-trained Language Models (Liu et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-acl.308.pdf
Video: https://aclanthology.org/2023.findings-acl.308.mp4