DUTNLP System for the WMT2023 Discourse-Level Literary Translation

Anqi Zhao, Kaiyu Huang, Hao Yu, Degen Huang


Abstract
This paper describes the DUTNLP Lab submission to the WMT23 Discourse-Level Literary Translation shared task in the Chinese-to-English translation direction under unconstrained conditions. Our primary system leverages a large language model with various prompt strategies, allowing us to investigate the capabilities of large language models for discourse-level neural machine translation. In addition, we test G-transformer, a widely used discourse-level machine translation model, with different training strategies. In our experiments, the LLM-based method achieves a BLEU score of 28.16, while the fine-tuned G-transformer scores 25.26. These findings indicate that choosing appropriate prompt strategies for large language models can significantly improve translation performance over traditional model training methods.
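The abstract centers on prompt strategies that supply discourse context to a large language model. As a rough illustration only (the paper's actual prompts, context window size, and choice of LLM are not reproduced here), the Python sketch below builds a Chinese-to-English translation prompt that prepends a few preceding sentences as context; build_prompt and its window parameter are hypothetical names introduced for demonstration.

    # Illustrative sketch of a context-aware prompt, in the spirit of the
    # paper's prompt strategies; not the authors' actual prompt wording.
    from typing import List

    def build_prompt(sentences: List[str], index: int, window: int = 2) -> str:
        """Build a Chinese->English prompt that includes the preceding
        `window` sentences as discourse context (an assumed strategy)."""
        context = sentences[max(0, index - window):index]
        lines = [
            "Translate the Chinese sentence into English,",
            "keeping pronouns and named entities consistent with the context.",
            "",
        ]
        if context:
            lines.append("Preceding source context:")
            lines.extend(f"- {s}" for s in context)
            lines.append("")
        lines.append(f"Sentence to translate: {sentences[index]}")
        return "\n".join(lines)

    if __name__ == "__main__":
        doc = ["他推开门。", "屋里一片漆黑。", "她没有说话。"]
        print(build_prompt(doc, 2))

The resulting prompt string would then be sent to whichever LLM the system uses; supplying preceding sentences in this way is one generic approach to giving a sentence-level model document-level information.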
Anthology ID: 2023.wmt-1.31
Volume: Proceedings of the Eighth Conference on Machine Translation
Month: December
Year: 2023
Address: Singapore
Editors: Philipp Koehn, Barry Haddow, Tom Kocmi, Christof Monz
Venue: WMT
SIG: SIGMT
Publisher: Association for Computational Linguistics
Pages: 296–301
URL: https://aclanthology.org/2023.wmt-1.31
DOI: 10.18653/v1/2023.wmt-1.31
Cite (ACL): Anqi Zhao, Kaiyu Huang, Hao Yu, and Degen Huang. 2023. DUTNLP System for the WMT2023 Discourse-Level Literary Translation. In Proceedings of the Eighth Conference on Machine Translation, pages 296–301, Singapore. Association for Computational Linguistics.
Cite (Informal): DUTNLP System for the WMT2023 Discourse-Level Literary Translation (Zhao et al., WMT 2023)
PDF: https://aclanthology.org/2023.wmt-1.31.pdf