Toucan: Many-to-Many Translation for 150 African Language Pairs

AbdelRahim Elmadany, Ife Adebara, Muhammad Abdul-Mageed


Abstract
We address a notable gap in Natural Language Processing (NLP) by introducing a collection of resources designed to improve Machine Translation (MT) for low-resource languages, with a specific focus on African languages. First, we introduce two language models (LMs), Cheetah-1.2B and Cheetah-3.7B, with 1.2 billion and 3.7 billion parameters, respectively. We then finetune these models to create Toucan, an Afrocentric machine translation model designed to support 156 African language pairs. To evaluate Toucan, we carefully develop an extensive machine translation benchmark, dubbed Afro-Lingu-MT, tailored for evaluating MT on African languages. Toucan significantly outperforms other models, showcasing its remarkable performance on MT for African languages. Finally, we train a new model, spBLEU-1K, to enhance translation evaluation metrics, covering 1K languages, including African languages. This work aims to advance the field of NLP, fostering cross-cultural understanding and knowledge exchange, particularly in regions with limited language resources such as Africa.
Anthology ID:
2024.findings-acl.781
Volume:
Findings of the Association for Computational Linguistics ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand and virtual meeting
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13189–13206
URL:
https://aclanthology.org/2024.findings-acl.781
Cite (ACL):
AbdelRahim Elmadany, Ife Adebara, and Muhammad Abdul-Mageed. 2024. Toucan: Many-to-Many Translation for 150 African Language Pairs. In Findings of the Association for Computational Linguistics ACL 2024, pages 13189–13206, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
Cite (Informal):
Toucan: Many-to-Many Translation for 150 African Language Pairs (Elmadany et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.781.pdf