Sentence-Level Agreement for Neural Machine Translation

Mingming Yang, Rui Wang, Kehai Chen, Masao Utiyama, Eiichiro Sumita, Min Zhang, Tiejun Zhao

Abstract
The training objective of neural machine translation (NMT) is to minimize the loss between the words in the translated sentences and those in the references. In NMT, there is a natural correspondence between the source sentence and the target sentence. However, this relationship has only been represented implicitly by the entire neural network, and the training objective is computed at the word level. In this paper, we propose a sentence-level agreement module to directly minimize the difference between the representations of the source and target sentences. The proposed agreement module can be integrated into NMT as an additional training objective and can also be used to enhance the representation of the source sentences. Empirical results on the NIST Chinese-to-English and WMT English-to-German tasks show that the proposed agreement module can significantly improve NMT performance.
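A minimal sketch of such a sentence-level agreement objective, assuming mean pooling over token-level hidden states and a squared-Euclidean distance between the pooled vectors; the pooling method, distance function, and the interpolation weight `lam` here are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def sentence_agreement_loss(src_states, tgt_states):
    """Agreement between source and target sentence representations.

    src_states / tgt_states: arrays of shape (seq_len, hidden_dim)
    holding the encoder/decoder hidden states of one sentence pair.
    """
    # Mean-pool token-level states into a single sentence vector each.
    src_vec = src_states.mean(axis=0)
    tgt_vec = tgt_states.mean(axis=0)
    # Squared Euclidean distance between the two sentence vectors.
    return float(np.sum((src_vec - tgt_vec) ** 2))

def total_loss(word_level_loss, src_states, tgt_states, lam=1.0):
    # The agreement term is added to the usual word-level
    # cross-entropy loss, weighted by a hyperparameter lam.
    return word_level_loss + lam * sentence_agreement_loss(src_states, tgt_states)
```

When the pooled source and target vectors coincide, the agreement term vanishes and training reduces to the standard word-level objective; otherwise the extra term pulls the two sentence representations toward each other.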
Anthology ID:
P19-1296
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3076–3082
URL:
https://aclanthology.org/P19-1296
DOI:
10.18653/v1/P19-1296
Cite (ACL):
Mingming Yang, Rui Wang, Kehai Chen, Masao Utiyama, Eiichiro Sumita, Min Zhang, and Tiejun Zhao. 2019. Sentence-Level Agreement for Neural Machine Translation. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 3076–3082, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Sentence-Level Agreement for Neural Machine Translation (Yang et al., ACL 2019)
PDF:
https://aclanthology.org/P19-1296.pdf