SharPT: Shared Latent Space Prompt Tuning

Bo Pang, Semih Yavuz, Caiming Xiong, Yingbo Zhou


Abstract
Prompt tuning is an efficient method for adapting large language models, and Soft Prompt Transfer (SPoT) further narrows the gap between prompt tuning and full-model tuning by transferring prompts learned on source tasks to target tasks. It is nevertheless difficult and expensive to identify the source task that provides the optimal prompts. In this work, we propose to learn a shared latent space that captures a set of basis skills from a mixture of source tasks. Given an instance, its embedding queries the latent space, yielding a basis-skill vector. This vector generates soft prompts, via a lightweight prompt generator, which modulate a frozen backbone model. The latent space and the prompt transformation are learned end-to-end by training on the source tasks. Transferring from the source tasks to a target task then amounts to finetuning only the prompt generator, roughly 0.3% of the parameters of the frozen backbone, while the shared latent space also remains frozen. Our approach outperforms prior soft-prompt methods by a significant margin on a variety of tasks, including NLI, sentence completion, QA, coreference resolution, and word sense disambiguation. Across various model scales, it also achieves performance competitive with finetuning the full model.
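Read operationally, the abstract describes a three-step pipeline: an instance embedding attends over a shared bank of basis-skill vectors, the resulting skill vector is mapped by a small generator to soft prompt embeddings, and those prompts condition a frozen backbone. The PyTorch sketch below illustrates one plausible realization of that pipeline; the module names, dimensions, and the softmax-attention read over the skill bank are our assumptions for illustration, not the paper's exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedLatentPromptGenerator(nn.Module):
    """Minimal sketch (assumed design, not the paper's exact one):
    attention over a shared bank of basis-skill vectors yields an
    instance-specific skill vector, which a lightweight MLP ("prompt
    generator") maps to soft prompt embeddings for a frozen LM."""

    def __init__(self, emb_dim=768, num_skills=16, prompt_len=20):
        super().__init__()
        # Shared latent space: learnable basis-skill vectors, trained on
        # the source-task mixture and kept frozen at transfer time.
        self.skill_bank = nn.Parameter(torch.randn(num_skills, emb_dim))
        self.query_proj = nn.Linear(emb_dim, emb_dim)
        # Lightweight prompt generator; in the paper this is the only
        # module finetuned on the target task (~0.3% of backbone params).
        self.prompt_gen = nn.Sequential(
            nn.Linear(emb_dim, emb_dim),
            nn.Tanh(),
            nn.Linear(emb_dim, prompt_len * emb_dim),
        )
        self.prompt_len = prompt_len
        self.emb_dim = emb_dim

    def forward(self, instance_emb):  # instance_emb: (batch, emb_dim)
        # The instance embedding queries the shared latent space.
        q = self.query_proj(instance_emb)                 # (B, D)
        attn = F.softmax(q @ self.skill_bank.T, dim=-1)   # (B, K)
        skill_vec = attn @ self.skill_bank                # (B, D) basis-skill vector
        # Generate soft prompts that are prepended to the frozen
        # backbone's input embeddings.
        prompts = self.prompt_gen(skill_vec)
        return prompts.view(-1, self.prompt_len, self.emb_dim)

gen = SharedLatentPromptGenerator()
# Target-task transfer (per the abstract): freeze the latent space and
# backbone, update only the prompt generator.
gen.skill_bank.requires_grad_(False)
```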
Anthology ID:
2023.findings-eacl.92
Volume:
Findings of the Association for Computational Linguistics: EACL 2023
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1244–1250
URL:
https://aclanthology.org/2023.findings-eacl.92
DOI:
10.18653/v1/2023.findings-eacl.92
Cite (ACL):
Bo Pang, Semih Yavuz, Caiming Xiong, and Yingbo Zhou. 2023. SharPT: Shared Latent Space Prompt Tuning. In Findings of the Association for Computational Linguistics: EACL 2023, pages 1244–1250, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
SharPT: Shared Latent Space Prompt Tuning (Pang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-eacl.92.pdf