CoDoNMT: Modeling Cohesion Devices for Document-Level Neural Machine Translation

Yikun Lei, Yuqi Ren, Deyi Xiong


Abstract
Cohesion devices, e.g., reiteration and coreference, are crucial for building cohesion links across sentences. In this paper, we propose a document-level neural machine translation framework, CoDoNMT, which models cohesion devices from two perspectives: Cohesion Device Masking (CoDM) and Cohesion Attention Focusing (CoAF). In CoDM, we mask cohesion devices in the current sentence and force NMT to predict them from inter-sentential context information; this prediction task is jointly trained with NMT. In CoAF, we guide the model to pay exclusive attention to relevant cohesion devices in the context when translating cohesion devices in the current sentence. This cohesion attention focusing strategy is softly applied to the self-attention layer. Experiments on three benchmark datasets demonstrate that our approach outperforms state-of-the-art document-level neural machine translation baselines. Further linguistic evaluation validates the effectiveness of the proposed model in producing cohesive translations.
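The masking step described for CoDM can be illustrated with a minimal sketch: tokens in the current sentence that match a set of known cohesion devices are replaced by a mask placeholder, and their positions become targets for the auxiliary prediction task. The function name, token list, and device set below are illustrative assumptions, not code from the paper.

```python
# Hypothetical sketch of Cohesion Device Masking (CoDM): replace cohesion
# devices in the current sentence with [MASK] so the model must recover
# them from inter-sentential context. Names here are illustrative only.

MASK = "[MASK]"

def mask_cohesion_devices(tokens, devices):
    """Replace each token found in `devices` with MASK; return the masked
    token sequence and the masked positions (prediction targets)."""
    masked, positions = [], []
    for i, tok in enumerate(tokens):
        if tok.lower() in devices:
            masked.append(MASK)
            positions.append(i)
        else:
            masked.append(tok)
    return masked, positions

sentence = "He bought the laptop because it was cheap".split()
devices = {"he", "it"}  # e.g. coreference pronouns linking to prior context

masked, targets = mask_cohesion_devices(sentence, devices)
print(masked)   # pronouns replaced by [MASK]
print(targets)  # positions the auxiliary task must predict
```

In training, the prediction loss over these masked positions would be combined with the standard NMT loss, encouraging the encoder to draw on context sentences when the local cue is hidden.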
Anthology ID:
2022.coling-1.462
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Editors:
Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
5205–5216
URL:
https://aclanthology.org/2022.coling-1.462
Cite (ACL):
Yikun Lei, Yuqi Ren, and Deyi Xiong. 2022. CoDoNMT: Modeling Cohesion Devices for Document-Level Neural Machine Translation. In Proceedings of the 29th International Conference on Computational Linguistics, pages 5205–5216, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
CoDoNMT: Modeling Cohesion Devices for Document-Level Neural Machine Translation (Lei et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.462.pdf
Code:
codeboy311/codonmt