mLUKE: The Power of Entity Representations in Multilingual Pretrained Language Models

Ryokan Ri, Ikuya Yamada, Yoshimasa Tsuruoka


Abstract
Recent studies have shown that multilingual pretrained language models can be effectively improved with cross-lingual alignment information from Wikipedia entities. However, existing methods only exploit entity information in pretraining and do not explicitly use entities in downstream tasks. In this study, we explore the effectiveness of leveraging entity representations for downstream cross-lingual tasks. We train a multilingual language model in 24 languages with entity representations and show that the model consistently outperforms word-based pretrained models in various cross-lingual transfer tasks. We also analyze the model, and the key insight is that incorporating entity representations into the input allows us to extract more language-agnostic features. We also evaluate the model with a multilingual cloze prompt task using the mLAMA dataset. We show that entity-based prompts are more likely to elicit correct factual knowledge than prompts using only word representations.
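
The sketch below illustrates how entity representations of the kind described in the abstract can be fed into the model at inference time. It is a minimal example only, assuming the Hugging Face transformers MLukeTokenizer and LukeModel classes and the publicly released studio-ousia/mluke-base checkpoint; the sample sentence, entity spans, and variable names are illustrative and not taken from the paper.

# Minimal sketch: obtaining word- and entity-level representations with mLUKE.
# Assumes Hugging Face transformers (MLukeTokenizer / LukeModel) and the
# studio-ousia/mluke-base checkpoint; the example text and spans are made up.
import torch
from transformers import MLukeTokenizer, LukeModel

tokenizer = MLukeTokenizer.from_pretrained("studio-ousia/mluke-base")
model = LukeModel.from_pretrained("studio-ousia/mluke-base")

text = "Tokyo is the capital of Japan."
# Character spans of the entity mentions we want entity representations for.
entity_spans = [(0, 5), (24, 29)]  # "Tokyo", "Japan"

inputs = tokenizer(text, entity_spans=entity_spans, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Contextualized word (subword) representations.
word_states = outputs.last_hidden_state            # shape: (1, seq_len, hidden)
# Entity representations for the two spans; the paper's analysis suggests these
# capture more language-agnostic features than word representations alone.
entity_states = outputs.entity_last_hidden_state   # shape: (1, 2, hidden)
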
Anthology ID:
2022.acl-long.505
Original:
2022.acl-long.505v1
Version 2:
2022.acl-long.505v2
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
7316–7330
URL:
https://aclanthology.org/2022.acl-long.505
DOI:
10.18653/v1/2022.acl-long.505
Cite (ACL):
Ryokan Ri, Ikuya Yamada, and Yoshimasa Tsuruoka. 2022. mLUKE: The Power of Entity Representations in Multilingual Pretrained Language Models. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 7316–7330, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
mLUKE: The Power of Entity Representations in Multilingual Pretrained Language Models (Ri et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.505.pdf
Video:
https://aclanthology.org/2022.acl-long.505.mp4
Code:
studio-ousia/luke + additional community code
Data:
CoNLL 2003, LAMA, MLQA, RELX, SQuAD, XQuAD