DLUT and GTCOM’s Neural Machine Translation Systems for WMT24

Hao Zong, Chao Bei, Huan Liu, Conghu Yuan, Wentao Chen, Degen Huang


Abstract
This paper presents the submission from Global Tone Communication Co., Ltd. (GTCOM) and Dalian University of Technology (DLUT) to the WMT24 General Machine Translation (MT) shared task, held at the Conference on Empirical Methods in Natural Language Processing (EMNLP). Our participation covers two language pairs: English to Japanese and Japanese to Chinese. The systems are developed without particular constraints or requirements, which allows us to explore a broad range of machine translation techniques. We emphasize back-translation, utilize multilingual translation models, and apply fine-tuning strategies to improve performance. Additionally, we combine human-generated and machine-generated data to fine-tune our models, leading to improved translation accuracy. The automatic evaluation results indicate that our system ranks first in terms of BLEU score for the Japanese to Chinese direction.
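As a rough illustration of the back-translation step mentioned in the abstract, the sketch below shows how monolingual target-side text can be turned into synthetic parallel data. It is a minimal example, not the authors' actual pipeline: it assumes the Hugging Face transformers library and a generic pretrained Japanese-to-English model (Helsinki-NLP/opus-mt-ja-en), neither of which is reported in the paper.

```python
# Minimal back-translation sketch (illustrative only; not the authors' pipeline).
# Assumptions: the `transformers` library and the generic pretrained
# Japanese-to-English model Helsinki-NLP/opus-mt-ja-en.
from transformers import pipeline

# Reverse-direction model: translates target-language (Japanese) monolingual
# text back into the source language (English).
back_translator = pipeline("translation", model="Helsinki-NLP/opus-mt-ja-en")

# Monolingual Japanese sentences (the real target side).
monolingual_ja = [
    "機械翻訳は近年大きく進歩した。",
    "逆翻訳は合成対訳データを作る手法である。",
]

# Build synthetic (English source, Japanese target) pairs for en->ja training.
synthetic_pairs = []
for ja_sentence in monolingual_ja:
    en_synthetic = back_translator(ja_sentence)[0]["translation_text"]
    synthetic_pairs.append((en_synthetic, ja_sentence))

# In practice, such synthetic pairs are mixed with genuine bitext
# (human-generated data) and used to fine-tune the forward en->ja model.
for src, tgt in synthetic_pairs:
    print(f"{src}\t{tgt}")
```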
Anthology ID: 2024.wmt-1.15
Volume: Proceedings of the Ninth Conference on Machine Translation
Month: November
Year: 2024
Address: Miami, Florida, USA
Editors: Barry Haddow, Tom Kocmi, Philipp Koehn, Christof Monz
Venue: WMT
Publisher: Association for Computational Linguistics
Pages: 227–231
URL: https://aclanthology.org/2024.wmt-1.15
DOI: 10.18653/v1/2024.wmt-1.15
Cite (ACL): Hao Zong, Chao Bei, Huan Liu, Conghu Yuan, Wentao Chen, and Degen Huang. 2024. DLUT and GTCOM’s Neural Machine Translation Systems for WMT24. In Proceedings of the Ninth Conference on Machine Translation, pages 227–231, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal): DLUT and GTCOM’s Neural Machine Translation Systems for WMT24 (Zong et al., WMT 2024)
PDF: https://aclanthology.org/2024.wmt-1.15.pdf