Bayesian Multi-Task Transfer Learning for Soft Prompt Tuning

Haeju Lee, Minchan Jeong, Se-Young Yun, Kee-Eung Kim


Abstract
Prompt tuning, in which prompts are optimized to adapt large-scale pre-trained language models to downstream tasks instead of fine-tuning the full model parameters, has been shown to be particularly effective when the prompts are trained in a multi-task transfer learning setting. These methods generally involve individually training prompts for each source task and then aggregating them to provide the initialization of the prompt for the target task. However, this approach critically ignores the fact that some of the source tasks could interfere with each other, positively or negatively. We argue that when we extract knowledge from source tasks by training source prompts, we should account for this correlation among source tasks for better transfer to target tasks. To this end, we propose a Bayesian approach in which we work with the posterior distribution of prompts across source tasks. We obtain representative source prompts corresponding to samples from the posterior using Stein Variational Gradient Descent, which are then aggregated to constitute the initial target prompt. We present extensive experimental results on standard benchmark NLP tasks, where our Bayesian multi-task transfer learning approach outperforms state-of-the-art methods in many settings. Furthermore, our approach requires no auxiliary models other than the prompt itself, achieving a high degree of parameter efficiency.
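
The abstract sketches the core mechanism: treat the prompts trained on the source tasks as particles, push them toward the posterior over prompts with Stein Variational Gradient Descent (SVGD), and aggregate the resulting particles to initialize the target prompt. Below is a minimal sketch of a generic SVGD update with an RBF kernel and the median-heuristic bandwidth, not the authors' implementation; `grad_log_posterior`, the particle count, and the simple-average aggregation are all illustrative assumptions.

```python
# Minimal SVGD sketch (illustrative; not the paper's implementation).
# Each row of `particles` is one flattened soft prompt; `grad_log_posterior`
# is an assumed user-supplied function returning the score (gradient of the
# log posterior, e.g. negative task loss plus a log-prior term) per particle.
import numpy as np

def rbf_kernel(particles):
    """RBF kernel matrix K and gradients of k(x_j, x_i) w.r.t. x_j."""
    diffs = particles[None, :, :] - particles[:, None, :]   # diffs[i, j] = x_j - x_i
    sq_dists = np.sum(diffs ** 2, axis=-1)
    # Median heuristic for the bandwidth, a common SVGD default.
    h = max(np.median(sq_dists) / np.log(len(particles) + 1), 1e-8)
    K = np.exp(-sq_dists / h)                               # K[i, j] = k(x_j, x_i)
    grad_K = (-2.0 / h) * diffs * K[:, :, None]             # grad_K[i, j] = d k / d x_j
    return K, grad_K

def svgd_step(particles, grad_log_posterior, step_size=1e-2):
    """One SVGD update: driving (kernel-weighted score) plus repulsive term."""
    n = len(particles)
    scores = grad_log_posterior(particles)                  # (n, d)
    K, grad_K = rbf_kernel(particles)                       # (n, n), (n, n, d)
    phi = (K @ scores + grad_K.sum(axis=1)) / n             # Stein update direction
    return particles + step_size * phi

# Toy usage: particles are driven toward a standard Gaussian posterior,
# then averaged into a single initial "target prompt".
rng = np.random.default_rng(0)
particles = rng.normal(scale=0.1, size=(8, 4))              # 8 prompt particles, dim 4
for _ in range(200):
    particles = svgd_step(particles, lambda p: -p)          # score of N(0, I)
target_init = particles.mean(axis=0)                        # simple aggregation
```

In the paper's setting, the log posterior would be defined over the source tasks, and the aggregation used to form the target-prompt initialization may be more elaborate than the simple average used above.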
Anthology ID:
2023.findings-emnlp.329
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4942–4958
URL:
https://aclanthology.org/2023.findings-emnlp.329
DOI:
10.18653/v1/2023.findings-emnlp.329
Cite (ACL):
Haeju Lee, Minchan Jeong, Se-Young Yun, and Kee-Eung Kim. 2023. Bayesian Multi-Task Transfer Learning for Soft Prompt Tuning. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 4942–4958, Singapore. Association for Computational Linguistics.
Cite (Informal):
Bayesian Multi-Task Transfer Learning for Soft Prompt Tuning (Lee et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.329.pdf