Syntax-Aware Retrieval Augmented Code Generation

Xiangyu Zhang, Yu Zhou, Guang Yang, Taolue Chen


Abstract
Neural code generation models are nowadays widely adopted to automatically generate code from natural language descriptions. Recently, pre-trained neural models equipped with token-level retrieval capabilities have exhibited great potential in neural machine translation. However, applying them directly to code generation poses challenges: the retrieval-based mechanism inevitably introduces extraneous noise into the generation process, which can result in syntactically incorrect code. Computationally, such models also require frequent searches of the cached datastore, which turns out to be time-consuming. To address these issues, we propose kNN-TRANX, a token-level retrieval augmented code generation method. kNN-TRANX allows for searches in smaller datastores tailored to the code generation task. It leverages syntax constraints during datastore retrieval, which reduces the impact of retrieval noise. We evaluate kNN-TRANX on two public datasets and the experimental results confirm the effectiveness of our approach.
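To make the retrieval-augmented decoding idea in the abstract concrete, the sketch below blends a model's next-token distribution with a kNN estimate built from retrieved neighbors, then applies a syntax mask that zeroes out grammatically invalid tokens. This is a minimal illustration of the general kNN-decoding recipe, not the paper's actual implementation; the function name, the interpolation weight `lam`, and the masking scheme are all assumptions made for the example.

```python
import numpy as np

def knn_augmented_probs(model_probs, neighbor_tokens, neighbor_dists,
                        valid_mask, lam=0.5, temperature=1.0):
    """Blend a model's next-token distribution with a kNN estimate and
    apply a syntax mask (illustrative sketch, not kNN-TRANX itself).

    model_probs:     base model distribution over the vocabulary
    neighbor_tokens: target tokens of the retrieved datastore entries
    neighbor_dists:  distances of those entries to the current context
    valid_mask:      1 for tokens the grammar allows here, 0 otherwise
    """
    vocab = len(model_probs)
    # Turn neighbor distances into weights (closer => larger weight).
    weights = np.exp(-np.asarray(neighbor_dists, dtype=float) / temperature)
    knn_probs = np.zeros(vocab)
    for tok, w in zip(neighbor_tokens, weights):
        knn_probs[tok] += w
    knn_probs /= knn_probs.sum()
    # Interpolate the retrieval estimate with the model distribution.
    mixed = lam * knn_probs + (1 - lam) * np.asarray(model_probs, dtype=float)
    # Syntax constraint: zero out tokens the grammar forbids at this
    # step, limiting the damage that noisy retrievals can do.
    mixed = mixed * np.asarray(valid_mask, dtype=float)
    return mixed / mixed.sum()
```

For instance, with a three-token vocabulary where retrieval strongly favors token 1 but the grammar forbids token 2, `knn_augmented_probs([0.5, 0.3, 0.2], [1, 1, 2], [0.1, 0.2, 1.0], [1, 1, 0])` returns a renormalized distribution that boosts token 1 and assigns zero mass to token 2.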
Anthology ID:
2023.findings-emnlp.90
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1291–1302
URL:
https://aclanthology.org/2023.findings-emnlp.90
DOI:
10.18653/v1/2023.findings-emnlp.90
Cite (ACL):
Xiangyu Zhang, Yu Zhou, Guang Yang, and Taolue Chen. 2023. Syntax-Aware Retrieval Augmented Code Generation. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 1291–1302, Singapore. Association for Computational Linguistics.
Cite (Informal):
Syntax-Aware Retrieval Augmented Code Generation (Zhang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.90.pdf