Modeling Consistency Preference via Lexical Chains for Document-level Neural Machine Translation

Xinglin Lyu, Junhui Li, Shimin Tao, Hao Yang, Ying Qin, Min Zhang


Abstract
In this paper, we aim to alleviate the issue of lexical translation inconsistency in document-level neural machine translation (NMT) by modeling consistency preference for lexical chains, which consist of repeated words in a source-side document and thus represent the document's lexical consistency structure. Specifically, we first propose lexical-consistency attention to capture consistency context among words in the same lexical chain. Then, for each lexical chain, we define and learn a consistency-tailored latent variable that guides the translation of the corresponding sentences to enhance lexical translation consistency. Experimental results on Chinese→English and French→English document-level translation tasks show that our approach not only significantly improves translation performance in BLEU, but also substantially alleviates the problem of lexical translation inconsistency.
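The abstract defines a lexical chain as the occurrences of a word repeated across a source-side document. As a minimal sketch of that idea (not the authors' implementation — the function name, tokenization, and parameters are all illustrative assumptions), one could collect per-word occurrence positions and keep only words that recur:

```python
import re
from collections import defaultdict

def extract_lexical_chains(sentences, min_occurrences=2, stopwords=None):
    """Group occurrence positions of repeated words into lexical chains.

    Returns a dict mapping each repeated word to a list of
    (sentence_index, token_index) positions. Illustrative sketch only.
    """
    stopwords = stopwords or set()
    occurrences = defaultdict(list)  # word -> [(sent_idx, tok_idx), ...]
    for i, sent in enumerate(sentences):
        for j, tok in enumerate(re.findall(r"\w+", sent.lower())):
            if tok not in stopwords:
                occurrences[tok].append((i, j))
    # A word forms a chain only if it recurs in the document
    return {w: pos for w, pos in occurrences.items() if len(pos) >= min_occurrences}

doc = [
    "The bank approved the loan.",
    "Later, the bank raised its rates.",
    "Customers protested the rates.",
]
chains = extract_lexical_chains(doc, stopwords={"the", "its"})
# "bank" and "rates" each recur, so each forms a chain
```

In the paper's setting, chains like these would then receive a shared consistency-tailored latent variable so that the repeated word is translated the same way in every sentence it appears in.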
Anthology ID:
2022.emnlp-main.424
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
6312–6326
URL:
https://aclanthology.org/2022.emnlp-main.424
DOI:
10.18653/v1/2022.emnlp-main.424
Cite (ACL):
Xinglin Lyu, Junhui Li, Shimin Tao, Hao Yang, Ying Qin, and Min Zhang. 2022. Modeling Consistency Preference via Lexical Chains for Document-level Neural Machine Translation. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 6312–6326, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Modeling Consistency Preference via Lexical Chains for Document-level Neural Machine Translation (Lyu et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.424.pdf