2022
AdapterHub Playground: Simple and Flexible Few-Shot Learning with Adapters
Tilman Beck | Bela Bohlender | Christina Viehmann | Vincent Hane | Yanik Adamson | Jaber Khuri | Jonas Brossmann | Jonas Pfeiffer | Iryna Gurevych
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: System Demonstrations
The open-access dissemination of pretrained language models through online repositories has led to a democratization of state-of-the-art natural language processing (NLP) research. This also allows people outside of NLP to use such models and adapt them to specific use-cases. However, a certain amount of technical proficiency is still required, which poses an entry barrier for users who want to apply these models to a specific task but lack the necessary knowledge or resources. In this work, we aim to close this gap by providing a tool which allows researchers to leverage pretrained models without writing a single line of code. Built upon the parameter-efficient adapter modules for transfer learning, our AdapterHub Playground provides an intuitive interface that allows adapters to be used for prediction, training, and analysis of textual data across a variety of NLP tasks. We present the tool’s architecture and demonstrate its advantages with prototypical use-cases, where we show that predictive performance can easily be increased in a few-shot learning scenario. Finally, we evaluate its usability in a user study. We provide the code and a live interface at https://adapter-hub.github.io/playground.
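The Playground itself requires no code, but the adapter workflow it wraps can be sketched with the adapter-transformers library behind AdapterHub. Below is a minimal sketch, not taken from the paper: the checkpoint bert-base-uncased and the AdapterHub identifier sentiment/sst-2@ukp are illustrative assumptions, and the API shown is the adapter-transformers interface as of around the paper's publication.

```python
# Minimal sketch of the adapter workflow the Playground wraps.
# Assumptions: adapter-transformers is installed, and the adapter
# "sentiment/sst-2@ukp" is available on AdapterHub (illustrative).
import torch
from transformers import AutoModelWithHeads, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelWithHeads.from_pretrained("bert-base-uncased")

# Download a task adapter (with its prediction head) from AdapterHub
# and activate it; the pretrained transformer weights stay untouched.
adapter_name = model.load_adapter("sentiment/sst-2@ukp")
model.set_active_adapters(adapter_name)

# Prediction with the pretrained adapter, no fine-tuning required.
inputs = tokenizer("This demo makes adapters easy to use.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print("predicted class:", logits.argmax(dim=-1).item())

# For the few-shot scenario, only the adapter (and head) parameters
# are unfrozen before training; the base model remains frozen.
model.train_adapter(adapter_name)
```

Because adapters insert only small bottleneck layers into each transformer block, the trainable parameter count stays at a few percent of the full model, which is what makes the few-shot training the abstract describes cheap enough to offer behind a web interface.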