Massive Choice, Ample Tasks (MaChAmp): A Toolkit for Multi-task Learning in NLP

Rob van der Goot, Ahmet Üstün, Alan Ramponi, Ibrahim Sharaf, Barbara Plank


Abstract
Transfer learning, particularly approaches that combine multi-task learning with pre-trained contextualized embeddings and fine-tuning, has advanced the field of Natural Language Processing tremendously in recent years. In this paper we present MaChAmp, a toolkit for easy fine-tuning of contextualized embeddings in multi-task settings. The benefits of MaChAmp are its flexible configuration options and its support for a variety of natural language processing tasks in a uniform toolkit, from text classification and sequence labeling to dependency parsing, masked language modeling, and text generation.
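To make the "flexible configuration" concrete: MaChAmp is driven by JSON dataset configurations that map dataset columns to tasks. The sketch below is illustrative only; the dataset name, file paths, and column indices are hypothetical, and the exact keys and supported task types should be checked against the toolkit's README.

```json
{
  "UD_EWT": {
    "train_data_path": "data/ewt.train.conllu",
    "validation_data_path": "data/ewt.dev.conllu",
    "word_idx": 1,
    "tasks": {
      "upos": { "task_type": "seq", "column_idx": 3 },
      "dependency": { "task_type": "dependency", "column_idx": 6 }
    }
  }
}
```

Because several such dataset blocks can be combined in one configuration, a single fine-tuning run can train one shared encoder with multiple task-specific decoders.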
Anthology ID:
2021.eacl-demos.22
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations
Month:
April
Year:
2021
Address:
Online
Editors:
Dimitra Gkatzia, Djamé Seddah
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
176–197
URL:
https://aclanthology.org/2021.eacl-demos.22
DOI:
10.18653/v1/2021.eacl-demos.22
Cite (ACL):
Rob van der Goot, Ahmet Üstün, Alan Ramponi, Ibrahim Sharaf, and Barbara Plank. 2021. Massive Choice, Ample Tasks (MaChAmp): A Toolkit for Multi-task Learning in NLP. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations, pages 176–197, Online. Association for Computational Linguistics.
Cite (Informal):
Massive Choice, Ample Tasks (MaChAmp): A Toolkit for Multi-task Learning in NLP (van der Goot et al., EACL 2021)
PDF:
https://aclanthology.org/2021.eacl-demos.22.pdf
Code
machamp-nlp/machamp + additional community code
Data
GLUE, MultiNLI, QNLI, SST, Universal Dependencies