Prompt Contrastive Transformation: An Enhanced Strategy for Efficient Prompt Transfer in Natural Language Processing

Shu Zhao, Shiji Yang, Shicheng Tan, Zhen Yang, Congyao Mei, Zhen Duan, Yanping Zhang, Jie Chen


Abstract
Prompt transfer is a transfer learning method based on prompt tuning that improves prompt performance on target tasks by transferring source prompt embeddings. Among existing methods, weighted aggregation is effective and has the advantages of being lightweight and modular. However, these methods may transfer redundant or irrelevant information from the source prompts to the target prompt, causing negative effects. To alleviate this problem, we propose Prompt Contrastive Transformation (PCT), which achieves efficient prompt transfer through prompt contrastive transformation and attentional fusion. PCT transforms the source prompts into a task-agnostic embedding and task-specific embeddings through singular value decomposition and contrastive learning, reducing information redundancy among the source prompts. The attention module in PCT selects the more effective task-specific embeddings and fuses them with the task-agnostic embedding into the target prompt. Experimental results show that, despite tuning only 0.035% of task-specific parameters, PCT improves prompt transfer for single-target-task adaptation across various NLP tasks.
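The pipeline the abstract describes — decomposing each source prompt into a shared, task-agnostic component and a residual task-specific component, then attention-weighting the task-specific parts into a target prompt — can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the prompt shapes, the use of plain SVD truncation in place of the paper's contrastive objective, and the query choice for attention are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def decompose_prompt(prompt, k=2):
    """Split a source prompt (length x dim) via SVD into a low-rank
    'task-agnostic' part (top-k singular directions) and the residual
    'task-specific' part. Stand-in for the paper's SVD + contrastive step."""
    U, S, Vt = np.linalg.svd(prompt, full_matrices=False)
    agnostic = (U[:, :k] * S[:k]) @ Vt[:k, :]
    specific = prompt - agnostic
    return agnostic, specific

def attention_fuse(query, candidates):
    """Scaled softmax attention over flattened candidate embeddings,
    returning their weighted sum and the attention weights."""
    q = query.ravel()
    keys = np.stack([c.ravel() for c in candidates])
    scores = keys @ q / np.sqrt(q.size)
    w = np.exp(scores - scores.max())
    w /= w.sum()
    fused = sum(wi * c for wi, c in zip(w, candidates))
    return fused, w

# Three hypothetical source prompts: 8 soft tokens x 16 dims each.
sources = [rng.standard_normal((8, 16)) for _ in range(3)]
parts = [decompose_prompt(p) for p in sources]

# Shared component: average of the task-agnostic parts.
shared = np.mean([a for a, _ in parts], axis=0)

# Attention selects among the task-specific parts (here using the
# shared embedding as query) and fuses them into the target prompt.
fused_specific, weights = attention_fuse(shared, [s for _, s in parts])
target_prompt = shared + fused_specific

print(target_prompt.shape)  # same shape as each source prompt: (8, 16)
```

In the actual method the decomposition is learned with a contrastive loss and the fused prompt is then tuned on the target task; the sketch only shows the data flow of the transform-and-fuse stage.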
Anthology ID:
2025.tacl-1.39
Volume:
Transactions of the Association for Computational Linguistics, Volume 13
Month:
Year:
2025
Address:
Cambridge, MA
Venue:
TACL
Publisher:
MIT Press
Pages:
848–860
URL:
https://aclanthology.org/2025.tacl-1.39/
DOI:
10.1162/tacl.a.22
Cite (ACL):
Shu Zhao, Shiji Yang, Shicheng Tan, Zhen Yang, Congyao Mei, Zhen Duan, Yanping Zhang, and Jie Chen. 2025. Prompt Contrastive Transformation: An Enhanced Strategy for Efficient Prompt Transfer in Natural Language Processing. Transactions of the Association for Computational Linguistics, 13:848–860.
Cite (Informal):
Prompt Contrastive Transformation: An Enhanced Strategy for Efficient Prompt Transfer in Natural Language Processing (Zhao et al., TACL 2025)
PDF:
https://aclanthology.org/2025.tacl-1.39.pdf