CogALex-VI Shared Task: Transrelation - A Robust Multilingual Language Model for Multilingual Relation Identification

Lennart Wachowiak, Christian Lang, Barbara Heinisch, Dagmar Gromann


Abstract
We describe our submission to the CogALex-VI shared task on the identification of multilingual paradigmatic relations, building on XLM-RoBERTa (XLM-R), a robustly optimized multilingual BERT model. Despite several experiments with data augmentation, data addition, and ensemble methods with a Siamese Triple Net, Transrelation, an XLM-R model with a linear classifier adapted to this specific task, performed best in testing and achieved the best results in the final evaluation of the shared task, even for a previously unseen language.
Anthology ID:
2020.cogalex-1.7
Volume:
Proceedings of the Workshop on the Cognitive Aspects of the Lexicon
Month:
December
Year:
2020
Address:
Online
Venues:
COLING | CogALex
Publisher:
Association for Computational Linguistics
Pages:
59–64
URL:
https://aclanthology.org/2020.cogalex-1.7
PDF:
https://aclanthology.org/2020.cogalex-1.7.pdf