Many-to-English Machine Translation Tools, Data, and Pretrained Models

Thamme Gowda, Zhao Zhang, Chris Mattmann, Jonathan May


Abstract
While there are more than 7000 languages in the world, most translation research efforts have targeted a few high-resource languages. Commercial translation systems support only one hundred languages or fewer, and do not make these models available for transfer to low-resource languages. In this work, we present useful tools for machine translation research: MTData, NLCodec, and RTG. We demonstrate their usefulness by creating a multilingual neural machine translation model capable of translating from 500 source languages to English. We make this multilingual model readily downloadable and usable as a service, or as a parent model for transfer learning to even lower-resource languages.
Anthology ID:
2021.acl-demo.37
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing: System Demonstrations
Month:
August
Year:
2021
Address:
Online
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
306–316
URL:
https://aclanthology.org/2021.acl-demo.37
DOI:
10.18653/v1/2021.acl-demo.37
PDF:
https://aclanthology.org/2021.acl-demo.37.pdf