Multi-Domain Neural Machine Translation with Word-Level Domain Context Discrimination
Jiali Zeng | Jinsong Su | Huating Wen | Yang Liu | Jun Xie | Yongjing Yin | Jianqiang Zhao
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
The study of multi-domain Neural Machine Translation (NMT), which has great practical value, mainly focuses on using mixed-domain parallel sentences to construct a unified model that can switch between different domains at translation time. Intuitively, the words in a sentence are related to its domain to varying degrees and thus exert disparate impacts on multi-domain NMT modeling. Based on this intuition, in this paper we distinguish and exploit word-level domain contexts for multi-domain NMT. To this end, we jointly model NMT with monolingual attention-based domain classification tasks and improve NMT in two ways: 1) based on the sentence representations produced by a domain classifier and an adversarial domain classifier, we generate two gating vectors and use them to construct domain-specific and domain-shared annotations for later translation predictions via different attention models; 2) we utilize the attention weights derived from the target-side domain classifier to adjust the weights of target words in the training objective, enabling domain-related words to have greater impact during model training. Experimental results on Chinese-English and English-French multi-domain translation tasks demonstrate the effectiveness of the proposed model. The source code for this paper is available on GitHub: https://github.com/DeepLearnXMU/WDCNMT.
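The sketch below is a minimal PyTorch illustration of the two ideas described in the abstract, not the authors' implementation (that is in the linked repository). All names and dimensions here (`DomainGate`, `weighted_nll`, `hidden_size`, the `1 + attention` weighting scheme) are illustrative assumptions.

```python
# Minimal sketch of the abstract's two word-level domain-context ideas.
# Assumptions: module/function names, dimensions, and the exact weighting
# scheme are hypothetical; see the paper's GitHub repo for the real code.
import torch
import torch.nn as nn
import torch.nn.functional as F

hidden_size = 512  # assumed encoder hidden size


class DomainGate(nn.Module):
    """Idea 1: turn the sentence representations of a domain classifier and
    an adversarial domain classifier into two gating vectors that split the
    encoder annotations into domain-specific and domain-shared parts."""

    def __init__(self, hidden_size):
        super().__init__()
        self.gate_specific = nn.Linear(hidden_size, hidden_size)
        self.gate_shared = nn.Linear(hidden_size, hidden_size)

    def forward(self, clf_repr, adv_repr, annotations):
        # clf_repr, adv_repr: (batch, hidden) sentence representations
        # annotations: (batch, src_len, hidden) encoder hidden states
        g_spec = torch.sigmoid(self.gate_specific(clf_repr)).unsqueeze(1)
        g_shared = torch.sigmoid(self.gate_shared(adv_repr)).unsqueeze(1)
        specific = g_spec * annotations   # attended to by one attention model
        shared = g_shared * annotations   # attended to by another
        return specific, shared


def weighted_nll(logits, targets, domain_attn, pad_id=0):
    """Idea 2: scale each target word's loss by the attention weight from a
    target-side domain classifier, so domain-related words count more."""
    # logits: (batch, tgt_len, vocab); targets: (batch, tgt_len)
    # domain_attn: (batch, tgt_len), the classifier's attention weights
    nll = F.cross_entropy(logits.transpose(1, 2), targets,
                          ignore_index=pad_id, reduction='none')
    # One simple scheme (an assumption): base weight 1 plus the attention
    # weight, so every word contributes and domain words get a boost.
    weights = 1.0 + domain_attn
    mask = (targets != pad_id).float()
    return (weights * nll * mask).sum() / mask.sum()


# Toy usage with random tensors:
gate = DomainGate(hidden_size)
enc = torch.randn(2, 7, hidden_size)
spec, shr = gate(torch.randn(2, hidden_size), torch.randn(2, hidden_size), enc)
```

The sigmoid gates act elementwise, so each annotation dimension can be routed partly to the domain-specific stream and partly to the shared stream rather than being hard-assigned to one.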