CodePrompt: Task-Agnostic Prefix Tuning for Program and Language Generation

YunSeok Choi, Jee-Hyong Lee


Abstract
To address the inefficient parameter updates and storage costs of fine-tuning in Natural Language Generation (NLG) tasks, prompt tuning methods have emerged as lightweight alternatives. Furthermore, efforts to reduce the gap between pre-training and fine-tuning have shown successful results in low-resource settings. As large Pre-trained Language Models (PLMs) for Program and Language Generation (PLG) tasks are constantly being developed, prompt tuning methods are also needed for these tasks. However, because the gap between pre-training and fine-tuning in PLMs for programming language differs from that in PLMs for natural language, a prompt tuning method that reflects the traits of PLMs for programming language is needed. In this paper, we propose CodePrompt, a task-agnostic prompt tuning method for PLG tasks that combines an Input-Dependent Prompt Template (to bridge the gap between pre-training and fine-tuning of PLMs for program and language) with Corpus-Specific Prefix Tuning (to efficiently update the parameters of PLMs for program and language). We also propose a method to provide richer prefix word information within a limited prefix length. We show that our method is effective on three PLG tasks, not only in the full-data setting but also in low-resource and cross-domain settings.
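For readers unfamiliar with the prefix tuning mechanism the abstract builds on (Li & Liang, 2021), the following is a minimal, generic PyTorch sketch, not the authors' CodePrompt implementation: a small set of trainable prefix vectors is mapped to per-layer attention keys/values while the base PLM stays frozen. All names and shapes here (PrefixEncoder, prefix_len, etc.) are illustrative assumptions, not taken from the paper.

    # Generic prefix tuning sketch (NOT the paper's CodePrompt method).
    # Only the prefix encoder is trained; the base PLM is frozen.
    import torch
    import torch.nn as nn

    class PrefixEncoder(nn.Module):
        """Maps fixed prefix ids to per-layer attention key/value vectors."""
        def __init__(self, prefix_len, n_layers, n_heads, head_dim):
            super().__init__()
            hidden = n_heads * head_dim
            self.register_buffer("prefix_ids", torch.arange(prefix_len))
            self.embed = nn.Embedding(prefix_len, hidden)
            # Reparameterize through an MLP for more stable optimization,
            # as is common in prefix tuning.
            self.mlp = nn.Sequential(
                nn.Linear(hidden, hidden),
                nn.Tanh(),
                nn.Linear(hidden, n_layers * 2 * hidden),  # keys and values per layer
            )
            self.prefix_len = prefix_len
            self.n_layers = n_layers
            self.n_heads = n_heads
            self.head_dim = head_dim

        def forward(self, batch_size):
            out = self.mlp(self.embed(self.prefix_ids))       # (P, L*2*H*D)
            out = out.view(self.prefix_len, self.n_layers, 2,
                           self.n_heads, self.head_dim)       # (P, L, 2, H, D)
            out = out.permute(1, 2, 3, 0, 4)                  # (L, 2, H, P, D)
            out = out.unsqueeze(2).expand(-1, -1, batch_size, -1, -1, -1)
            return out                                        # (L, 2, B, H, P, D)

    # Usage sketch: prepend the returned tensors to each attention layer's
    # cached keys/values, and freeze every parameter of the base model:
    #   for p in base_model.parameters():
    #       p.requires_grad = False
    enc = PrefixEncoder(prefix_len=10, n_layers=12, n_heads=12, head_dim=64)
    past_kv = enc(batch_size=4)

The paper's contribution layers input-dependent prompt templates and corpus-specific prefixes on top of this general idea; the sketch above shows only the shared prefix-tuning backbone.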
Anthology ID:
2023.findings-acl.325
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5282–5297
URL:
https://aclanthology.org/2023.findings-acl.325
DOI:
10.18653/v1/2023.findings-acl.325
Cite (ACL):
YunSeok Choi and Jee-Hyong Lee. 2023. CodePrompt: Task-Agnostic Prefix Tuning for Program and Language Generation. In Findings of the Association for Computational Linguistics: ACL 2023, pages 5282–5297, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
CodePrompt: Task-Agnostic Prefix Tuning for Program and Language Generation (Choi & Lee, Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.325.pdf
Video:
https://aclanthology.org/2023.findings-acl.325.mp4