Translating Similar Languages: Role of Mutual Intelligibility in Multilingual Transformers

Ife Adebara, El Moatez Billah Nagoudi, Muhammad Abdul-Mageed


Abstract
In this work, we investigate different approaches to translating between similar languages under low-resource conditions. The work was carried out as part of the UBC NLP research group's participation in the WMT 2020 Similar Languages Translation Shared Task. We participated in all language pairs and performed various experiments. All models use a Transformer architecture, and back-translation is employed for one of the language pairs. We explore both bilingual and multilingual approaches, and we describe the pre-processing, training, translation, and results for each model. We also investigate the role of mutual intelligibility in model performance.
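The abstract names two modeling choices, a Transformer architecture and a bilingual/multilingual setup with back-translation for one pair. Below is a minimal, illustrative sketch (not the authors' released code) of how such a setup is commonly wired together in PyTorch: a small Transformer encoder-decoder trained on token IDs, with a target-language tag prepended to the source so one model can serve several translation directions. The class name, toy vocabulary, language tags, and example tokens are invented for illustration; the paper's exact pre-processing and hyperparameters are not reproduced here.

```python
# Minimal sketch, assuming a standard PyTorch setup; all names below
# (TinyTransformerMT, toy_vocab, the <2xx> language tags) are illustrative
# and not taken from the paper or its released code.
import torch
import torch.nn as nn


class TinyTransformerMT(nn.Module):
    """A small Transformer encoder-decoder over a shared (sub)word vocabulary."""

    def __init__(self, vocab_size, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model,
            nhead=nhead,
            num_encoder_layers=num_layers,
            num_decoder_layers=num_layers,
            batch_first=True,
        )
        self.proj = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, tgt_ids):
        # src_ids: (batch, src_len); tgt_ids: (batch, tgt_len)
        src = self.embed(src_ids)
        tgt = self.embed(tgt_ids)
        # Causal mask so each target position only attends to earlier positions.
        tgt_mask = self.transformer.generate_square_subsequent_mask(tgt_ids.size(1))
        out = self.transformer(src, tgt, tgt_mask=tgt_mask)
        return self.proj(out)  # (batch, tgt_len, vocab_size)


# Toy shared vocabulary; a real system would use a learned subword vocabulary.
toy_vocab = {"<pad>": 0, "<2hi>": 1, "<2mr>": 2, "ghar": 3, "ghara": 4}


def tag_source(token_ids, target_lang_tag_id):
    """Multilingual trick: prepend a target-language tag to the source sentence
    so a single model can translate in several directions."""
    return [target_lang_tag_id] + token_ids


model = TinyTransformerMT(vocab_size=len(toy_vocab))

# One toy Hindi->Marathi training example, tagged with the Marathi target tag.
src = torch.tensor([tag_source([toy_vocab["ghar"]], toy_vocab["<2mr>"])])
tgt = torch.tensor([[toy_vocab["ghara"]]])

logits = model(src, tgt)
loss = nn.functional.cross_entropy(
    logits.view(-1, logits.size(-1)), tgt.view(-1), ignore_index=toy_vocab["<pad>"]
)
loss.backward()  # a single toy step; real training loops over the parallel corpus
```

Back-translation, used for one language pair in the paper, would plug into a sketch like this as a separate target-to-source model whose translations of monolingual target-side text are added to the parallel data as synthetic source sentences.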
Anthology ID: 2020.wmt-1.42
Volume: Proceedings of the Fifth Conference on Machine Translation
Month: November
Year: 2020
Address: Online
Editors: Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Yvette Graham, Paco Guzman, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, André Martins, Makoto Morishita, Christof Monz, Masaaki Nagata, Toshiaki Nakazawa, Matteo Negri
Venue: WMT
SIG: SIGMT
Publisher: Association for Computational Linguistics
Pages: 381–386
URL: https://aclanthology.org/2020.wmt-1.42
Cite (ACL): Ife Adebara, El Moatez Billah Nagoudi, and Muhammad Abdul-Mageed. 2020. Translating Similar Languages: Role of Mutual Intelligibility in Multilingual Transformers. In Proceedings of the Fifth Conference on Machine Translation, pages 381–386, Online. Association for Computational Linguistics.
Cite (Informal): Translating Similar Languages: Role of Mutual Intelligibility in Multilingual Transformers (Adebara et al., WMT 2020)
PDF: https://aclanthology.org/2020.wmt-1.42.pdf
Video: https://slideslive.com/38939640