Does Transliteration Help Multilingual Language Modeling?

Ibraheem Muhammad Moosa, Mahmud Elahi Akhter, Ashfia Binte Habib


Abstract
Script diversity presents a challenge to Multilingual Language Models (MLLMs) by reducing lexical overlap among closely related languages. Therefore, transliterating closely related languages that use different writing scripts to a common script may improve the downstream task performance of MLLMs. We empirically measure the effect of transliteration on MLLMs in this context. We specifically focus on the Indic languages, which have the highest script diversity in the world, and we evaluate our models on the IndicGLUE benchmark. We perform the Mann-Whitney U test to rigorously verify whether the effect of transliteration is statistically significant. We find that transliteration benefits the low-resource languages without negatively affecting the comparatively high-resource languages. We also measure the cross-lingual representation similarity of the models using centered kernel alignment on parallel sentences from the FLORES-101 dataset. We find that the transliteration-based model learns more similar representations for parallel sentences across different languages.
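The abstract names two analysis tools: a Mann-Whitney U test over downstream scores and linear centered kernel alignment (CKA) over parallel-sentence representations. The sketch below is not the authors' released code; it assumes sentence representations are available as NumPy arrays with aligned rows, and the score lists are hypothetical placeholders, not results from the paper.

```python
import numpy as np
from scipy.stats import mannwhitneyu

def linear_cka(X: np.ndarray, Y: np.ndarray) -> float:
    """Linear CKA between representation matrices X (n x d1) and
    Y (n x d2), where row i of X and row i of Y encode the same
    sentence in two different languages."""
    X = X - X.mean(axis=0, keepdims=True)  # center each feature dimension
    Y = Y - Y.mean(axis=0, keepdims=True)
    num = np.linalg.norm(Y.T @ X, ord="fro") ** 2
    den = (np.linalg.norm(X.T @ X, ord="fro")
           * np.linalg.norm(Y.T @ Y, ord="fro"))
    return float(num / den)

# Hypothetical per-language downstream scores for the two pretraining
# setups; in the paper these would come from IndicGLUE evaluations.
scores_transliterated = [71.2, 68.5, 74.0, 66.3, 70.1]
scores_baseline = [69.8, 65.9, 73.5, 63.2, 69.4]

# One-sided Mann-Whitney U test: does transliteration yield higher scores?
stat, p_value = mannwhitneyu(scores_transliterated, scores_baseline,
                             alternative="greater")
print(f"U = {stat:.1f}, p = {p_value:.3f}")
```

A higher CKA value between representations of FLORES-101 parallel sentences would indicate more language-agnostic sentence encodings, which is the quantity the abstract reports as improved under transliteration.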
Anthology ID: 2023.findings-eacl.50
Volume: Findings of the Association for Computational Linguistics: EACL 2023
Month: May
Year: 2023
Address: Dubrovnik, Croatia
Editors: Andreas Vlachos, Isabelle Augenstein
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 670–685
URL: https://aclanthology.org/2023.findings-eacl.50
DOI: 10.18653/v1/2023.findings-eacl.50
Cite (ACL): Ibraheem Muhammad Moosa, Mahmud Elahi Akhter, and Ashfia Binte Habib. 2023. Does Transliteration Help Multilingual Language Modeling?. In Findings of the Association for Computational Linguistics: EACL 2023, pages 670–685, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal): Does Transliteration Help Multilingual Language Modeling? (Moosa et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-eacl.50.pdf
Video: https://aclanthology.org/2023.findings-eacl.50.mp4