Choose the Final Translation from NMT and LLM Hypotheses Using MBR Decoding: HW-TSC’s Submission to the WMT24 General MT Shared Task

Zhanglin Wu, Daimeng Wei, Zongyao Li, Hengchao Shang, Jiaxin Guo, Shaojun Li, Zhiqiang Rao, Yuanchang Luo, Ning Xie, Hao Yang


Abstract
This paper presents the submission of Huawei Translation Services Center (HW-TSC) to the WMT24 general machine translation (MT) shared task, where we participate in the English-to-Chinese (en→zh) language pair. As in previous years' work, we use training strategies such as regularized dropout, bidirectional training, data diversification, forward translation, back translation, alternated training, curriculum learning, and transductive ensemble learning to train a neural machine translation (NMT) model based on the deep Transformer-big architecture. The difference is that we also use continual pre-training, supervised fine-tuning, and contrastive preference optimization to train a large language model (LLM) based MT model. By using minimum Bayes risk (MBR) decoding to select the final translation from the multiple hypotheses produced by the NMT and LLM-based MT models, our submission achieves competitive results in the final evaluation.
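For readers unfamiliar with MBR decoding, the sketch below illustrates the selection step the abstract describes: pool the candidate translations from both systems, then return the candidate with the highest expected utility, treating the other candidates as pseudo-references under a uniform distribution. The token-overlap utility, the function names, and the toy hypotheses are illustrative assumptions; the paper's actual utility metric and candidate generation are described in the full text.

```python
def utility(hypothesis: str, reference: str) -> float:
    """Toy utility: token-level F1 overlap. A stand-in (assumption) for
    whatever sentence-level metric a real MBR pipeline would use,
    e.g., a lexical or neural quality metric."""
    h, r = hypothesis.split(), reference.split()
    if not h or not r:
        return 0.0
    common = len(set(h) & set(r))
    if common == 0:
        return 0.0
    precision, recall = common / len(h), common / len(r)
    return 2 * precision * recall / (precision + recall)


def mbr_select(candidates: list[str]) -> str:
    """Minimum Bayes risk selection: pick the candidate whose average
    utility against all other candidates (acting as pseudo-references)
    is highest."""
    def expected_utility(hyp: str) -> float:
        others = [c for c in candidates if c is not hyp]
        return sum(utility(hyp, ref) for ref in others) / max(len(others), 1)

    return max(candidates, key=expected_utility)


# Hypothetical usage: pool hypotheses from the NMT and LLM systems,
# then pick the consensus translation.
nmt_hypotheses = ["the cat sat on the mat", "a cat sat on a mat"]
llm_hypotheses = ["the cat sat on the mat .", "the cat is sitting on the mat"]
print(mbr_select(nmt_hypotheses + llm_hypotheses))
```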
Anthology ID: 2024.wmt-1.9
Volume: Proceedings of the Ninth Conference on Machine Translation
Month: November
Year: 2024
Address: Miami, Florida, USA
Editors: Barry Haddow, Tom Kocmi, Philipp Koehn, Christof Monz
Venue: WMT
Publisher: Association for Computational Linguistics
Pages: 155–164
URL: https://aclanthology.org/2024.wmt-1.9
Cite (ACL): Zhanglin Wu, Daimeng Wei, Zongyao Li, Hengchao Shang, Jiaxin Guo, Shaojun Li, Zhiqiang Rao, Yuanchang Luo, Ning Xie, and Hao Yang. 2024. Choose the Final Translation from NMT and LLM Hypotheses Using MBR Decoding: HW-TSC’s Submission to the WMT24 General MT Shared Task. In Proceedings of the Ninth Conference on Machine Translation, pages 155–164, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal): Choose the Final Translation from NMT and LLM Hypotheses Using MBR Decoding: HW-TSC’s Submission to the WMT24 General MT Shared Task (Wu et al., WMT 2024)
PDF: https://aclanthology.org/2024.wmt-1.9.pdf