Achieving State-of-the-Art Multilingual Translation Model with Minimal Data and Parameters

Hui Zeng


Abstract
This is LanguageX (ZengHuiMT)’s submission to the WMT 2023 General Machine Translation task for 13 language directions. We first train an encoder-decoder model on all 13 competition translation directions as our baseline system. We then switch to a decoder-only architecture and fine-tune a multilingual language model on data partially sampled from diverse multilingual datasets such as CC100 and WuDaoCorpora. This model is further fine-tuned on carefully curated high-quality parallel corpora across multiple translation directions to enable it to perform translation. According to automatic evaluation metrics, our model ranks first among all participating teams in the English-to-Russian, English-to-German, and English-to-Ukrainian directions; second in English-to-Czech, English-to-Hebrew, Hebrew-to-English, and Ukrainian-to-English; and third in German-to-English, Japanese-to-English, and Russian-to-English. Our best-performing model, covering all 13 translation directions, is on par with GPT-4: it surpasses GPT-4 in BLEU score on 7 of the 13 directions.
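As a rough illustration of the "partial sampling" step described in the abstract, the sketch below subsamples lines from several monolingual corpora into a single fine-tuning mixture. It is a minimal sketch under stated assumptions, not the authors' pipeline; the file paths and sampling rates are hypothetical placeholders.

```python
# Minimal sketch (not the authors' code) of partially sampling a
# fine-tuning mixture from several multilingual corpora such as
# CC100 and WuDaoCorpora. Paths and rates are hypothetical.
import random

# corpus file -> fraction of its lines to keep in the mixture
SAMPLING_RATES = {
    "cc100.txt": 0.05,         # hypothetical: keep 5% of CC100 lines
    "wudaocorpora.txt": 0.10,  # hypothetical: keep 10% of WuDaoCorpora lines
}

def sample_mixture(rates: dict[str, float], seed: int = 0) -> list[str]:
    """Line-level subsampling: keep each line with probability p."""
    rng = random.Random(seed)
    mixture: list[str] = []
    for path, p in rates.items():
        with open(path, encoding="utf-8") as f:
            mixture.extend(line.rstrip("\n") for line in f if rng.random() < p)
    rng.shuffle(mixture)  # interleave corpora before fine-tuning
    return mixture
```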
Anthology ID: 2023.wmt-1.18
Volume: Proceedings of the Eighth Conference on Machine Translation
Month: December
Year: 2023
Address: Singapore
Editors: Philipp Koehn, Barry Haddow, Tom Kocmi, Christof Monz
Venue: WMT
SIG: SIGMT
Publisher: Association for Computational Linguistics
Pages: 181–186
URL: https://aclanthology.org/2023.wmt-1.18
DOI: 10.18653/v1/2023.wmt-1.18
Cite (ACL): Hui Zeng. 2023. Achieving State-of-the-Art Multilingual Translation Model with Minimal Data and Parameters. In Proceedings of the Eighth Conference on Machine Translation, pages 181–186, Singapore. Association for Computational Linguistics.
Cite (Informal): Achieving State-of-the-Art Multilingual Translation Model with Minimal Data and Parameters (Zeng, WMT 2023)
PDF: https://aclanthology.org/2023.wmt-1.18.pdf