Language Modelling as a Multi-Task Problem

Lucas Weber, Jaap Jumelet, Elia Bruni, Dieuwke Hupkes


Abstract
In this paper, we propose to study language modelling as a multi-task problem, bringing together three strands of research: multi-task learning, linguistics, and interpretability. Based on hypotheses derived from linguistic theory, we investigate whether language models adhere to learning principles of multi-task learning during training. To showcase the idea, we analyse the generalisation behaviour of language models as they learn the linguistic concept of Negative Polarity Items (NPIs). Our experiments demonstrate that a multi-task setting naturally emerges within the objective of the more general task of language modelling. We argue that this insight is valuable for multi-task learning, linguistics and interpretability research and can lead to exciting new findings in all three domains.
Anthology ID:
2021.eacl-main.176
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
Month:
April
Year:
2021
Address:
Online
Editors:
Paola Merlo, Jörg Tiedemann, Reut Tsarfaty
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
2049–2060
URL:
https://aclanthology.org/2021.eacl-main.176
DOI:
10.18653/v1/2021.eacl-main.176
Cite (ACL):
Lucas Weber, Jaap Jumelet, Elia Bruni, and Dieuwke Hupkes. 2021. Language Modelling as a Multi-Task Problem. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 2049–2060, Online. Association for Computational Linguistics.
Cite (Informal):
Language Modelling as a Multi-Task Problem (Weber et al., EACL 2021)
PDF:
https://aclanthology.org/2021.eacl-main.176.pdf