Ask2Transformers: Zero-Shot Domain labelling with Pretrained Language Models

Oscar Sainz, German Rigau


Abstract
In this paper we present a system that exploits different pre-trained Language Models to assign domain labels to WordNet synsets without any kind of supervision. Furthermore, the system is not restricted to a particular set of domain labels. We exploit the knowledge encoded within different off-the-shelf pre-trained Language Models and task formulations to infer the domain label of a particular WordNet definition. The proposed zero-shot system achieves a new state of the art on the English dataset used in the evaluation.
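The sketch below illustrates the general idea behind the approach: a model fine-tuned on MultiNLI (listed under Data) scores candidate domain labels against a WordNet definition by treating each label as an entailment hypothesis. It uses the Hugging Face zero-shot-classification pipeline; the model checkpoint, candidate labels, hypothesis template, and example gloss are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of NLI-based zero-shot domain labelling.
# Assumptions (not from the paper): the concrete checkpoint, the
# candidate domain labels, the hypothesis template, and the gloss.
from transformers import pipeline

# An off-the-shelf model fine-tuned on MultiNLI.
classifier = pipeline("zero-shot-classification",
                      model="roberta-large-mnli")

# A WordNet-style definition (gloss) acts as the premise.
gloss = "hospital: a health facility where patients receive treatment"

# Candidate domain labels; the approach is not tied to a fixed set.
domains = ["medicine", "sport", "politics", "economy", "religion"]

# Each label is expanded into a hypothesis such as
# "The text is about medicine." and scored by entailment probability.
result = classifier(gloss, domains,
                    hypothesis_template="The text is about {}.")
print(result["labels"][0])  # highest-scoring domain, e.g. "medicine"
```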
Anthology ID: 2021.gwc-1.6
Volume: Proceedings of the 11th Global Wordnet Conference
Month: January
Year: 2021
Address: University of South Africa (UNISA)
Editors: Piek Vossen, Christiane Fellbaum
Venue: GWC
SIG: SIGLEX
Publisher: Global Wordnet Association
Pages: 44–52
URL: https://aclanthology.org/2021.gwc-1.6
Cite (ACL): Oscar Sainz and German Rigau. 2021. Ask2Transformers: Zero-Shot Domain labelling with Pretrained Language Models. In Proceedings of the 11th Global Wordnet Conference, pages 44–52, University of South Africa (UNISA). Global Wordnet Association.
Cite (Informal): Ask2Transformers: Zero-Shot Domain labelling with Pretrained Language Models (Sainz & Rigau, GWC 2021)
PDF: https://aclanthology.org/2021.gwc-1.6.pdf
Code: osainz59/Ask2Transformers
Data: MultiNLI