The AISP-SJTU Translation System for WMT 2022
Guangfeng Liu | Qinpei Zhu | Xingyu Chen | Renjie Feng | Jianxin Ren | Renshou Wu | Qingliang Miao | Rui Wang | Kai Yu
Proceedings of the Seventh Conference on Machine Translation (WMT)
This paper describes AISP-SJTU’s participation in the WMT 2022 shared general MT task. We participated in four translation directions: English-Chinese, Chinese-English, English-Japanese and Japanese-English. Our systems are based on the Transformer architecture with several effective variants in network depth and internal structure. In our experiments, we employ data filtering, large-scale back-translation, knowledge distillation, forward-translation, iterative in-domain knowledge fine-tuning and model ensembling. The constrained systems achieve case-sensitive BLEU scores of 48.8, 29.7, 39.3 and 22.0 on EN-ZH, ZH-EN, EN-JA and JA-EN, respectively.
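The reported numbers are case-sensitive BLEU scores. A minimal sketch of how such scores could be computed with sacrebleu follows; it is not taken from the paper, the file paths and helper function are hypothetical, and the target-side tokenizer choices for Chinese and Japanese are assumptions.

```python
# Sketch: case-sensitive corpus BLEU with sacrebleu.
# File names are placeholders, not from the paper.
import sacrebleu

def corpus_bleu_score(hyp_path: str, ref_path: str, lang: str) -> float:
    """Score one translation direction; `lang` selects the target-side tokenizer."""
    with open(hyp_path, encoding="utf-8") as f:
        hyps = [line.rstrip("\n") for line in f]
    with open(ref_path, encoding="utf-8") as f:
        refs = [line.rstrip("\n") for line in f]

    # sacrebleu is case-sensitive by default (lowercase=False); Chinese and
    # Japanese targets use their own tokenizers ("zh", "ja-mecab").
    tokenize = {"zh": "zh", "ja": "ja-mecab"}.get(lang, "13a")
    bleu = sacrebleu.corpus_bleu(hyps, [refs], tokenize=tokenize)
    return bleu.score

# Example usage (hypothetical paths):
# print(corpus_bleu_score("en-zh.hyp", "en-zh.ref", lang="zh"))
```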