Regularized Context Gates on Transformer for Machine Translation

Xintong Li, Lemao Liu, Rui Wang, Guoping Huang, Max Meng


Abstract
Context gates are effective in controlling the contributions from the source and target contexts in recurrent neural network (RNN) based neural machine translation (NMT). However, it is challenging to extend them to the Transformer architecture, which is more complex than an RNN. This paper first provides a method to identify source and target contexts and then introduces a gate mechanism to control the source and target contributions in the Transformer. In addition, to further reduce the bias problem in the gate mechanism, this paper proposes a regularization method that guides the learning of the gates with supervision automatically generated using pointwise mutual information. Extensive experiments on 4 translation datasets demonstrate that the proposed model obtains an average gain of 1.0 BLEU over a strong Transformer baseline.
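To make the abstract's gate mechanism concrete, below is a minimal PyTorch sketch, not the authors' implementation: a gate blends a source context vector (e.g. from encoder-decoder attention) with a target context vector (e.g. from decoder self-attention), and a regularizer nudges the gate toward a PMI-derived supervision signal. The names `ContextGate`, `gate_regularizer`, and `pmi_target` are hypothetical, and the construction of the PMI supervision itself (described in the paper) is only stubbed as an input here.

```python
import torch
import torch.nn as nn

class ContextGate(nn.Module):
    """Sketch of a context gate for the Transformer decoder (assumed form).

    Blends a source context vector with a target context vector via an
    element-wise sigmoid gate, in the spirit of context gates for RNN NMT.
    """

    def __init__(self, d_model: int):
        super().__init__()
        self.proj = nn.Linear(2 * d_model, d_model)

    def forward(self, src_ctx: torch.Tensor, tgt_ctx: torch.Tensor):
        # g in (0, 1): per-dimension weight on the source context.
        g = torch.sigmoid(self.proj(torch.cat([src_ctx, tgt_ctx], dim=-1)))
        # Convex combination of the two contexts, plus the gate for inspection.
        return g * src_ctx + (1.0 - g) * tgt_ctx, g

def gate_regularizer(g: torch.Tensor, pmi_target: torch.Tensor) -> torch.Tensor:
    """Hypothetical regularization term: pushes the mean gate value per
    position toward a supervision signal derived from pointwise mutual
    information between source and target words (PMI construction not shown).
    """
    return ((g.mean(dim=-1) - pmi_target) ** 2).mean()
```

In training, a term like `gate_regularizer(g, pmi_target)` would be added to the translation loss with a weighting coefficient, so that gates learn to rely more on the source context where the PMI supervision indicates a strong source-target association.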
Anthology ID:
2020.acl-main.757
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
8555–8562
URL:
https://aclanthology.org/2020.acl-main.757
DOI:
10.18653/v1/2020.acl-main.757
Cite (ACL):
Xintong Li, Lemao Liu, Rui Wang, Guoping Huang, and Max Meng. 2020. Regularized Context Gates on Transformer for Machine Translation. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 8555–8562, Online. Association for Computational Linguistics.
Cite (Informal):
Regularized Context Gates on Transformer for Machine Translation (Li et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.757.pdf
Video:
http://slideslive.com/38928846