NAIST-NICT WMT’23 General MT Task Submission

Hiroyuki Deguchi, Kenji Imamura, Yuto Nishida, Yusuke Sakai, Justin Vasselli, Taro Watanabe


Abstract
In this paper, we describe our NAIST-NICT submission to the WMT’23 English ↔ Japanese general machine translation task. Our system generates diverse translation candidates and reranks them with a two-stage reranking system to find the best translation. First, we generated 50 candidates each from 18 translation methods, using a variety of techniques to increase the diversity of the candidates: we trained seven models per language direction with various combinations of hyperparameters, and generated candidates from them with different decoding algorithms, model ensembling, and kNN-MT (Khandelwal et al., 2021). We then processed the 900 translation candidates through the two-stage reranking system to find the most promising candidate. In the first stage, we compared the 50 candidates from each translation method using DrNMT (Lee et al., 2021) and kept the highest-scoring candidate per method. In the second stage, we ranked the resulting 18 candidates using COMET-MBR (Fernandes et al., 2022) and selected the best one as the system output. We found that generating diverse translation candidates improved translation quality when combined with a well-designed reranker.
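The abstract describes a concrete pipeline: 18 generation methods × 50 candidates each, a per-method DrNMT reranking stage, and a final COMET-MBR stage over the 18 survivors. Below is a minimal sketch of that two-stage selection logic, assuming hypothetical scoring callables `drnmt_score` and `comet_score` in place of the actual neural DrNMT reranker and COMET metric; it illustrates only the selection logic, not the authors' implementation.

```python
# A minimal sketch of the two-stage reranking described in the abstract,
# assuming hypothetical scoring callables in place of the real neural models:
#   - drnmt_score(src, hyp)       stands in for DrNMT (Lee et al., 2021)
#   - comet_score(src, hyp, ref)  stands in for COMET, used as the MBR utility
# This illustrates the selection logic only, not the authors' implementation.
from typing import Callable, Sequence


def stage1_drnmt(
    candidates_per_method: Sequence[Sequence[str]],  # 18 methods x 50 candidates
    source: str,
    drnmt_score: Callable[[str, str], float],
) -> list[str]:
    """Keep the highest-scoring candidate from each translation method."""
    return [
        max(candidates, key=lambda hyp: drnmt_score(source, hyp))
        for candidates in candidates_per_method
    ]


def stage2_comet_mbr(
    finalists: Sequence[str],  # the 18 stage-1 winners
    source: str,
    comet_score: Callable[[str, str, str], float],
) -> str:
    """Pick the finalist with the highest expected COMET utility (MBR).

    Each finalist is scored against every other finalist treated as a
    pseudo-reference, so the winner is the candidate that agrees most,
    under the metric, with the rest of the pool.
    """
    def expected_utility(i: int) -> float:
        hyp = finalists[i]
        refs = [r for j, r in enumerate(finalists) if j != i]
        return sum(comet_score(source, hyp, r) for r in refs) / len(refs)

    return finalists[max(range(len(finalists)), key=expected_utility)]


if __name__ == "__main__":
    # Toy scorers for demonstration only; real scorers are neural models.
    toy_drnmt = lambda src, hyp: -abs(len(hyp) - len(src))
    toy_comet = lambda src, hyp, ref: -abs(len(hyp) - len(ref))

    src = "こんにちは、世界"
    methods = [
        ["Hello, world.", "Hi, world."],
        ["Hello world!", "Greetings, world."],
    ]
    finalists = stage1_drnmt(methods, src, toy_drnmt)
    print(stage2_comet_mbr(finalists, src, toy_comet))
```

The MBR stage is the reason candidate diversity pays off: with more varied finalists, agreement under COMET is a stronger signal of translation quality than any single model's own score.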
Anthology ID:
2023.wmt-1.7
Volume:
Proceedings of the Eighth Conference on Machine Translation
Month:
December
Year:
2023
Address:
Singapore
Editors:
Philipp Koehn, Barry Haddow, Tom Kocmi, Christof Monz
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
110–118
URL:
https://aclanthology.org/2023.wmt-1.7
DOI:
10.18653/v1/2023.wmt-1.7
Cite (ACL):
Hiroyuki Deguchi, Kenji Imamura, Yuto Nishida, Yusuke Sakai, Justin Vasselli, and Taro Watanabe. 2023. NAIST-NICT WMT’23 General MT Task Submission. In Proceedings of the Eighth Conference on Machine Translation, pages 110–118, Singapore. Association for Computational Linguistics.
Cite (Informal):
NAIST-NICT WMT’23 General MT Task Submission (Deguchi et al., WMT 2023)
PDF:
https://aclanthology.org/2023.wmt-1.7.pdf