Adaptor: Objective-Centric Adaptation Framework for Language Models

Michal Štefánik, Vít Novotný, Nikola Groverová, Petr Sojka


Abstract
This paper introduces the Adaptor library, which transposes the traditional model-centric approach, composed of pre-training and fine-tuning steps, into an objective-centric approach that composes the training process from applications of selected objectives. We survey research directions that can benefit from enhanced objective-centric experimentation: multitask training, custom objective development, dynamic training curricula, and domain adaptation. Adaptor aims to ease the reproducibility of these research directions in practice. Finally, we demonstrate the practical applicability of Adaptor in selected unsupervised domain adaptation scenarios.
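The objective-centric idea — composing training as a schedule over interchangeable objectives rather than as a fixed pre-train + fine-tune pipeline — can be illustrated with a minimal, library-agnostic sketch. The `Objective`, `RoundRobinSchedule`, and `train` names below mirror the paper's terminology but are hypothetical simplifications, not Adaptor's actual API:

```python
from dataclasses import dataclass
from itertools import cycle
from typing import List, Tuple

# Hypothetical minimal mirror of objective-centric training:
# each Objective owns its data and loss; a Schedule decides which
# objective drives each training step. Names are illustrative only.

@dataclass
class Objective:
    name: str
    data: List[Tuple[float, float]]  # (input, target) pairs

    def loss_and_grad(self, w: float) -> Tuple[float, float]:
        """Mean squared error of the linear model y = w * x, and d(loss)/dw."""
        n = len(self.data)
        loss = sum((w * x - y) ** 2 for x, y in self.data) / n
        grad = sum(2 * (w * x - y) * x for x, y in self.data) / n
        return loss, grad

class RoundRobinSchedule:
    """Alternate objectives step by step (a simple multitask curriculum)."""
    def __init__(self, objectives: List[Objective]):
        self._iter = cycle(objectives)

    def next_objective(self) -> Objective:
        return next(self._iter)

def train(schedule: RoundRobinSchedule, w: float = 0.0,
          steps: int = 200, lr: float = 0.05) -> float:
    for _ in range(steps):
        objective = schedule.next_objective()
        _, grad = objective.loss_and_grad(w)
        w -= lr * grad  # one SGD step driven by the current objective
    return w

# Two toy "tasks" that agree on the underlying weight w = 2.0:
objective_a = Objective("objective_a", [(1.0, 2.0), (2.0, 4.0)])
objective_b = Objective("objective_b", [(3.0, 6.0), (0.5, 1.0)])
w_final = train(RoundRobinSchedule([objective_a, objective_b]))
```

In Adaptor itself, objectives wrap real model heads and datasets, and schedules implement curricula over them; the sketch only shows the control flow of alternating objectives within one training loop.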
Anthology ID:
2022.acl-demo.26
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: System Demonstrations
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Valerio Basile, Zornitsa Kozareva, Sanja Stajner
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
261–269
URL:
https://aclanthology.org/2022.acl-demo.26
DOI:
10.18653/v1/2022.acl-demo.26
Cite (ACL):
Michal Štefánik, Vít Novotný, Nikola Groverová, and Petr Sojka. 2022. Adaptor: Objective-Centric Adaptation Framework for Language Models. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: System Demonstrations, pages 261–269, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Adaptor: Objective-Centric Adaptation Framework for Language Models (Štefánik et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-demo.26.pdf
Video:
https://aclanthology.org/2022.acl-demo.26.mp4
Code:
gaussalgo/adaptor