Modeling Discourse Structure for Document-level Neural Machine Translation
Junxuan Chen | Xiang Li | Jiarui Zhang | Chulun Zhou | Jianwei Cui | Bin Wang | Jinsong Su
Proceedings of the First Workshop on Automatic Simultaneous Translation, 2020
Recently, document-level neural machine translation (NMT) has become a hot topic in the machine translation community. Despite its success, most existing studies ignore the discourse structure information of the input document to be translated, which has been shown to be effective in other tasks. In this paper, we propose to improve document-level NMT with the aid of discourse structure information. Our encoder is based on a hierarchical attention network (HAN) (Miculicich et al., 2018). Specifically, we first parse the input document to obtain its discourse structure. Then, we introduce a Transformer-based path encoder to embed the discourse structure information of each word. Finally, we combine the discourse structure information with the word embedding before feeding it into the encoder. Experimental results on the English-to-German dataset show that our model significantly outperforms both Transformer and Transformer+HAN.
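The abstract describes combining a word's discourse-structure path embedding (produced by a Transformer-based path encoder) with its word embedding before the HAN-based encoder. The sketch below is a minimal illustration of that idea, not the authors' implementation: the module names, dimensions, path representation, and mean-pooling over path nodes are all assumptions.

```python
# A minimal sketch, assuming each word carries a sequence of discourse-node
# label ids describing its path in the parsed discourse tree (root -> leaf).
# Not the authors' code; names, dimensions, and pooling are illustrative.
import torch
import torch.nn as nn


class DiscoursePathEncoder(nn.Module):
    """Embeds the discourse-structure path attached to each word with a small Transformer."""

    def __init__(self, num_path_labels: int, d_model: int = 512,
                 nhead: int = 8, num_layers: int = 2):
        super().__init__()
        self.label_emb = nn.Embedding(num_path_labels, d_model, padding_idx=0)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, path_ids: torch.Tensor) -> torch.Tensor:
        # path_ids: (batch, seq_len, path_len) discourse-node label ids per word.
        b, s, p = path_ids.shape
        x = self.label_emb(path_ids.view(b * s, p))   # (b*s, path_len, d_model)
        x = self.encoder(x)                           # contextualize the path nodes
        return x.mean(dim=1).view(b, s, -1)           # pool the path into one vector per word


class StructureAwareEmbedding(nn.Module):
    """Combines word embeddings with discourse-path embeddings (here by simple addition)."""

    def __init__(self, vocab_size: int, num_path_labels: int, d_model: int = 512):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, d_model, padding_idx=0)
        self.path_enc = DiscoursePathEncoder(num_path_labels, d_model)

    def forward(self, token_ids: torch.Tensor, path_ids: torch.Tensor) -> torch.Tensor:
        # The fused representation would then be fed into the (HAN-based) document encoder.
        return self.word_emb(token_ids) + self.path_enc(path_ids)
```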