TaskWeb: Selecting Better Source Tasks for Multi-task NLP

Joongwon Kim, Akari Asai, Gabriel Ilharco, Hannaneh Hajishirzi


Abstract
Recent work in NLP has shown promising results in training models on large numbers of tasks to achieve better generalization. However, it is not well understood how tasks are related, and how helpful training tasks can be chosen for a new task. In this work, we investigate whether knowing task relationships via pairwise task transfer improves choosing one or more source tasks that help to learn a new target task. We provide TaskWeb, a large-scale benchmark of pairwise task transfers for 22 NLP tasks using three different model types, sizes, and adaptation methods, spanning about 25,000 experiments. Then, we design a new method, TaskShop, based on our analysis of TaskWeb. TaskShop uses TaskWeb to estimate the benefit of using a source task for learning a new target task, and to choose a subset of helpful training tasks for multi-task training. Our method improves overall rankings and top-k precision of source tasks by 10% and 38%, respectively. We also use TaskShop to build much smaller multi-task training sets that improve zero-shot performance across 11 different target tasks by at least 4.3%.
Anthology ID:
2023.emnlp-main.680
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
11032–11052
URL:
https://aclanthology.org/2023.emnlp-main.680
DOI:
10.18653/v1/2023.emnlp-main.680
Cite (ACL):
Joongwon Kim, Akari Asai, Gabriel Ilharco, and Hannaneh Hajishirzi. 2023. TaskWeb: Selecting Better Source Tasks for Multi-task NLP. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 11032–11052, Singapore. Association for Computational Linguistics.
Cite (Informal):
TaskWeb: Selecting Better Source Tasks for Multi-task NLP (Kim et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.680.pdf
Video:
https://aclanthology.org/2023.emnlp-main.680.mp4