Multilingualism Encourages Recursion: a Transfer Study with mBERT

Andrea De Varda, Roberto Zamparelli


Abstract
This work investigates the relational structures learnt by mBERT, a multilingual transformer-based network, with respect to several cross-linguistic regularities proposed in theoretical and quantitative linguistics. We pursued this objective through a zero-shot transfer experiment, evaluating the model’s ability to generalize its native task to artificial languages that either respect or violate a proposed language universal, and comparing its performance to that of BERT, a monolingual model with an identical configuration. We created four artificial corpora with a Probabilistic Context-Free Grammar, manipulating the distribution of tokens and the structure of their dependency relations. We showed that while both models were favoured by a Zipfian distribution of the tokens and by the presence of head-dependency structures, the multilingual network exhibited a stronger reliance on hierarchical cues than its monolingual counterpart.
Anthology ID:
2022.sigtyp-1.1
Volume:
Proceedings of the 4th Workshop on Research in Computational Linguistic Typology and Multilingual NLP
Month:
July
Year:
2022
Address:
Seattle, Washington
Editors:
Ekaterina Vylomova, Edoardo Ponti, Ryan Cotterell
Venue:
SIGTYP
SIG:
SIGTYP
Publisher:
Association for Computational Linguistics
Pages:
1–10
URL:
https://aclanthology.org/2022.sigtyp-1.1
DOI:
10.18653/v1/2022.sigtyp-1.1
Cite (ACL):
Andrea De Varda and Roberto Zamparelli. 2022. Multilingualism Encourages Recursion: a Transfer Study with mBERT. In Proceedings of the 4th Workshop on Research in Computational Linguistic Typology and Multilingual NLP, pages 1–10, Seattle, Washington. Association for Computational Linguistics.
Cite (Informal):
Multilingualism Encourages Recursion: a Transfer Study with mBERT (De Varda & Zamparelli, SIGTYP 2022)
PDF:
https://aclanthology.org/2022.sigtyp-1.1.pdf
Video:
https://aclanthology.org/2022.sigtyp-1.1.mp4