Factual Consistency of Multilingual Pretrained Language Models

Constanza Fierro, Anders Søgaard


Abstract
Pretrained language models can be queried for factual knowledge, with potential applications in knowledge base acquisition and tasks that require inference. However, for that, we need to know how reliable this knowledge is, and recent work has shown that monolingual English language models lack consistency when predicting factual knowledge, that is, they fill in the blank differently for paraphrases describing the same fact. In this paper, we extend the analysis of consistency to a multilingual setting. We introduce a resource, mParaRel, and investigate (i) whether multilingual language models such as mBERT and XLM-R are more consistent than their monolingual counterparts; and (ii) if such models are equally consistent across languages. We find that mBERT is as inconsistent as English BERT in English paraphrases, but that both mBERT and XLM-R exhibit a high degree of inconsistency in English and even more so for all the other 45 languages.
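The consistency probe described above amounts to querying a masked language model with cloze-style paraphrases of the same fact and comparing the predictions. Below is a minimal sketch of that idea using the Hugging Face fill-mask pipeline; the model name, templates, and the simple "top-1 agreement" check are illustrative assumptions, not the authors' mParaRel evaluation code.

```python
from transformers import pipeline

# Illustrative sketch (not the paper's evaluation code): query mBERT with two
# paraphrases of the same fact and check whether the top predictions agree.
fill = pipeline("fill-mask", model="bert-base-multilingual-cased")

# Hypothetical paraphrase templates for a single (subject, relation) pair.
paraphrases = [
    "Paris is the capital of [MASK].",
    "The capital city of [MASK] is Paris.",
]

# Take the highest-scoring token for each paraphrase.
predictions = [fill(template)[0]["token_str"] for template in paraphrases]

# The model is consistent on this fact if all paraphrases yield the same token.
consistent = len(set(predictions)) == 1
print(predictions, "consistent" if consistent else "inconsistent")
```

Aggregating such agreement checks over many facts, paraphrase pairs, and languages gives a consistency score per language, which is the kind of comparison the paper reports for mBERT and XLM-R.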
Anthology ID:
2022.findings-acl.240
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3046–3052
URL:
https://aclanthology.org/2022.findings-acl.240
DOI:
10.18653/v1/2022.findings-acl.240
Cite (ACL):
Constanza Fierro and Anders Søgaard. 2022. Factual Consistency of Multilingual Pretrained Language Models. In Findings of the Association for Computational Linguistics: ACL 2022, pages 3046–3052, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Factual Consistency of Multilingual Pretrained Language Models (Fierro & Søgaard, Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.240.pdf
Software:
 2022.findings-acl.240.software.zip
Video:
 https://aclanthology.org/2022.findings-acl.240.mp4
Code:
 coastalcph/mpararel
Data:
LAMA