Cai Jie


2022

Transn’s Submissions to the WMT22 Translation Suggestion Task
Mao Hongbao | Zhang Wenbo | Cai Jie | Cheng Jianwei
Proceedings of the Seventh Conference on Machine Translation (WMT)

This paper describes Transn’s submissions to the WMT 2022 shared task on Translation Suggestion. Our team participated in two tasks, Naive Translation Suggestion and Translation Suggestion with Hints, focusing on two language directions: Zh→En and En→Zh. Apart from the golden training data provided by the shared task, we utilized a synthetic corpus to fine-tune DeltaLM (∆LM), a pre-trained encoder-decoder language model. We applied a two-stage training strategy to ∆LM and several effective methods to generate the synthetic corpus, which contributed substantially to the results. According to the official evaluation results in terms of BLEU scores, our submissions to Naive Translation Suggestion En→Zh and Translation Suggestion with Hints (both Zh→En and En→Zh) ranked first, and our Naive Translation Suggestion Zh→En submission also achieved a result comparable to the best score.
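
The sketch below illustrates, in broad strokes, what fine-tuning a pre-trained encoder-decoder model on translation-suggestion-style data can look like. It is not the authors’ code: DeltaLM is not a standard Hugging Face model class, so a generic multilingual seq2seq checkpoint (facebook/mbart-large-50) stands in for it, and the data fields, masking convention, and hyperparameters are illustrative assumptions only.

```python
# Minimal sketch (assumptions noted above) of fine-tuning an encoder-decoder
# model for translation suggestion: the input pairs a source sentence with an
# MT hypothesis whose incorrect span is masked; the target is the suggestion.
from datasets import Dataset
from transformers import (AutoModelForSeq2SeqLM, AutoTokenizer,
                          DataCollatorForSeq2Seq, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments)

checkpoint = "facebook/mbart-large-50"  # stand-in for DeltaLM
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Hypothetical example format: source sentence, masked MT hypothesis, and the
# reference suggestion for the masked span.
examples = [{
    "src": "今天天气很好。",
    "masked_mt": "The weather is <mask> today.",
    "suggestion": "very nice",
}]

def preprocess(batch):
    # Concatenate source and masked hypothesis as the encoder input.
    inputs = [f"{s} </s> {m}" for s, m in zip(batch["src"], batch["masked_mt"])]
    enc = tokenizer(inputs, truncation=True, max_length=256)
    labels = tokenizer(text_target=batch["suggestion"], truncation=True, max_length=64)
    enc["labels"] = labels["input_ids"]
    return enc

train_ds = Dataset.from_list(examples).map(preprocess, batched=True)

# A two-stage strategy would first train on the large synthetic corpus, then
# continue from that checkpoint on the golden task data (typically with a
# lower learning rate); only the first stage is shown here.
args = Seq2SeqTrainingArguments(
    output_dir="ts-stage1",
    learning_rate=3e-5,
    num_train_epochs=1,
    per_device_train_batch_size=8,
)
trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
    tokenizer=tokenizer,
)
trainer.train()
```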