Learning to Predict Task Transferability via Soft Prompt

Lingyun Feng


Abstract
Fine-tuning pretrained language models on helpful intermediate tasks often greatly improves performance on target tasks. However, how to efficiently find source tasks that transfer successfully remains under-explored. In this work, we propose to learn an affinity scoring function to predict transferability between tasks. Specifically, we conduct prompt tuning and regard the resulting soft prompts as task embeddings that summarize task-specific information. We then randomly sample task pairs to train an affinity scoring function whose goal is to predict the transfer gain (i.e., affinity) of a task pair, conditioned on the two task embeddings. Once the scoring function is trained, given a novel target task we use it to predict the most transferable source tasks, without a brute-force search over all possible source-target pairs. Experimental results across 50 tasks show that our method efficiently identifies beneficial source tasks for transfer learning.
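To make the idea concrete, below is a minimal sketch of the kind of affinity scoring function the abstract describes: two soft prompts (treated as task embeddings) are pooled, concatenated, and fed to a small regressor that predicts the transfer gain of a source-target pair. This is not the paper's released code; the pooling choice, MLP architecture, dimensions, and all names (e.g., AffinityScorer) are illustrative assumptions.

```python
# Hypothetical sketch of an affinity scoring function over soft-prompt
# task embeddings. Architecture and dimensions are assumptions, not the
# paper's actual implementation.
import torch
import torch.nn as nn

class AffinityScorer(nn.Module):
    def __init__(self, embed_dim: int = 768, hidden: int = 512):
        super().__init__()
        # A soft prompt of shape (prompt_len, embed_dim) is mean-pooled into
        # a single task-embedding vector before scoring.
        self.mlp = nn.Sequential(
            nn.Linear(2 * embed_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, source_prompt: torch.Tensor, target_prompt: torch.Tensor) -> torch.Tensor:
        # source_prompt, target_prompt: (batch, prompt_len, embed_dim)
        src = source_prompt.mean(dim=1)            # (batch, embed_dim)
        tgt = target_prompt.mean(dim=1)            # (batch, embed_dim)
        pair = torch.cat([src, tgt], dim=-1)       # (batch, 2 * embed_dim)
        return self.mlp(pair).squeeze(-1)          # predicted transfer gain

# Training sketch: regress predicted affinity onto observed transfer gains
# for randomly sampled (source, target) task pairs.
scorer = AffinityScorer()
optimizer = torch.optim.Adam(scorer.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

# Dummy batch standing in for sampled task pairs and their measured gains.
src_prompts = torch.randn(8, 100, 768)
tgt_prompts = torch.randn(8, 100, 768)
observed_gain = torch.randn(8)

pred = scorer(src_prompts, tgt_prompts)
loss = loss_fn(pred, observed_gain)
loss.backward()
optimizer.step()
```

At inference time, one would score every candidate source task against the soft prompt of a new target task and rank the sources by predicted gain, avoiding exhaustive fine-tuning over all source-target pairs.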
Anthology ID:
2023.emnlp-main.546
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
8829–8844
URL:
https://aclanthology.org/2023.emnlp-main.546
DOI:
10.18653/v1/2023.emnlp-main.546
Cite (ACL):
Lingyun Feng. 2023. Learning to Predict Task Transferability via Soft Prompt. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 8829–8844, Singapore. Association for Computational Linguistics.
Cite (Informal):
Learning to Predict Task Transferability via Soft Prompt (Feng, EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.546.pdf
Video:
https://aclanthology.org/2023.emnlp-main.546.mp4