ICT’s System for AutoSimTrans 2021: Robust Char-Level Simultaneous Translation

Shaolei Zhang, Yang Feng


Abstract
Simultaneous translation (ST) outputs the translation while still reading the input sentence, and is an important component of simultaneous interpretation. In this paper, we describe our submitted ST system, which won first place in the streaming-transcription input track of the Chinese-English translation task of AutoSimTrans 2021. Aiming at the robustness of ST, we first propose char-level simultaneous translation and apply the wait-k policy to it. Meanwhile, we apply two data processing methods and combine two training methods for domain adaptation. Our method gives the ST model stronger robustness and domain adaptability. Experiments on streaming transcription show that our method outperforms the baseline at all latency levels; at low latency in particular, it improves BLEU by about 6 points. In addition, ablation studies verify the effectiveness of each module in the proposed method.
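The wait-k policy mentioned in the abstract follows a fixed read/write schedule: read k source units before emitting the first target token, then alternate one write per additional read. The sketch below illustrates this schedule applied to character-level input, as the paper proposes; the translate_prefix function and the streaming interface are hypothetical placeholders for illustration, not the authors' implementation.

    # Minimal sketch of a char-level wait-k read/write schedule.
    # translate_prefix is a hypothetical prefix-to-prefix decoder step:
    # given the source characters read so far and the target tokens written
    # so far, it returns the next target token, or None when finished.

    def wait_k_schedule(source_chars, k, translate_prefix):
        read, written = [], []
        source_iter = iter(source_chars)
        finished_reading = False
        while True:
            # READ: stay exactly k characters ahead of the written tokens
            # (wait-k schedule: g(t) = t - 1 + k source units at step t).
            while not finished_reading and len(read) < len(written) + k:
                try:
                    read.append(next(source_iter))
                except StopIteration:
                    finished_reading = True
            # WRITE: emit one target token conditioned on the source prefix.
            token = translate_prefix(read, written)
            if token is None:
                break
            written.append(token)
            yield token

Operating on characters rather than word or subword segments means the policy never has to wait for a streaming ASR transcript to be segmented, which is one way to read the robustness claim above.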
Anthology ID:
2021.autosimtrans-1.1
Volume:
Proceedings of the Second Workshop on Automatic Simultaneous Translation
Month:
June
Year:
2021
Address:
Online
Editors:
Hua Wu, Colin Cherry, Liang Huang, Zhongjun He, Qun Liu, Maha Elbayad, Mark Liberman, Haifeng Wang, Mingbo Ma, Ruiqing Zhang
Venue:
AutoSimTrans
Publisher:
Association for Computational Linguistics
Pages:
1–11
URL:
https://aclanthology.org/2021.autosimtrans-1.1
DOI:
10.18653/v1/2021.autosimtrans-1.1
Bibkey:
Cite (ACL):
Shaolei Zhang and Yang Feng. 2021. ICT’s System for AutoSimTrans 2021: Robust Char-Level Simultaneous Translation. In Proceedings of the Second Workshop on Automatic Simultaneous Translation, pages 1–11, Online. Association for Computational Linguistics.
Cite (Informal):
ICT’s System for AutoSimTrans 2021: Robust Char-Level Simultaneous Translation (Zhang & Feng, AutoSimTrans 2021)
PDF:
https://aclanthology.org/2021.autosimtrans-1.1.pdf