Tram: A Token-level Retrieval-augmented Mechanism for Source Code Summarization

Tong Ye, Lingfei Wu, Tengfei Ma, Xuhong Zhang, Yangkai Du, Peiyu Liu, Shouling Ji, Wenhai Wang


Abstract
Source code summarization aims to automatically generate human-readable text describing the functionality of a program. Although neural language models achieve significant performance in this field, they are limited by their inability to access external knowledge. To address this limitation, an emerging trend is combining neural models with external knowledge through retrieval methods. Previous methods have relied on a sentence-level retrieval paradigm on the encoder side. However, this paradigm is coarse-grained, noisy, and cannot directly exploit high-quality retrieved summary tokens on the decoder side. In this paper, we propose a fine-grained Token-level retrieval-augmented mechanism (Tram) on the decoder side rather than the encoder side to enhance the performance of neural models and produce more low-frequency tokens in generated summaries. Furthermore, to overcome the challenge of token-level retrieval in capturing contextual code semantics, we also propose integrating code semantics into individual summary tokens. The results of extensive experiments and human evaluation show that our token-level retrieval-augmented approach significantly improves performance and is more interpretable.
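To illustrate the general idea behind decoder-side token-level retrieval augmentation, the sketch below interpolates the model's next-token distribution with a distribution induced by the nearest neighbors of the current decoder state in a datastore of (hidden state, next summary token) pairs. This is a minimal, kNN-LM-style illustration of the paradigm, not the paper's exact method; all names, the distance metric, and the interpolation weight `lam` are assumptions for the example.

```python
import numpy as np

def knn_augmented_probs(query, datastore_keys, datastore_tokens,
                        model_probs, vocab_size, k=2, lam=0.5, temp=1.0):
    """Blend the model's next-token distribution with a kNN distribution.

    query            : decoder hidden state at the current step, shape (d,)
    datastore_keys   : stored decoder states, shape (n, d)
    datastore_tokens : next-token ids paired with each key, shape (n,)
    model_probs      : base model's next-token distribution, shape (vocab_size,)
    lam              : interpolation weight for the retrieval distribution
    """
    # L2 distance from the query state to every datastore key
    dists = np.linalg.norm(datastore_keys - query, axis=1)
    nn = np.argsort(dists)[:k]

    # Softmax over negative distances of the k nearest neighbors
    logits = -dists[nn] / temp
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()

    # Scatter neighbor weights onto their paired summary tokens
    knn_probs = np.zeros(vocab_size)
    for w, idx in zip(weights, nn):
        knn_probs[datastore_tokens[idx]] += w

    # Interpolate retrieval and model distributions
    return lam * knn_probs + (1.0 - lam) * model_probs
```

Because the retrieved distribution concentrates mass on the exact tokens seen in similar contexts, this kind of interpolation is what lets retrieval surface low-frequency tokens that the base model alone tends to under-generate.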
Anthology ID:
2024.findings-naacl.186
Volume:
Findings of the Association for Computational Linguistics: NAACL 2024
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2959–2971
URL:
https://aclanthology.org/2024.findings-naacl.186
DOI:
10.18653/v1/2024.findings-naacl.186
Cite (ACL):
Tong Ye, Lingfei Wu, Tengfei Ma, Xuhong Zhang, Yangkai Du, Peiyu Liu, Shouling Ji, and Wenhai Wang. 2024. Tram: A Token-level Retrieval-augmented Mechanism for Source Code Summarization. In Findings of the Association for Computational Linguistics: NAACL 2024, pages 2959–2971, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Tram: A Token-level Retrieval-augmented Mechanism for Source Code Summarization (Ye et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-naacl.186.pdf