Prior Knowledge Integration for Neural Machine Translation using Posterior Regularization

Jiacheng Zhang, Yang Liu, Huanbo Luan, Jingfang Xu, Maosong Sun


Abstract
Although neural machine translation has made significant progress recently, how to integrate multiple overlapping, arbitrary prior knowledge sources remains a challenge. In this work, we propose to use posterior regularization as a general framework for integrating prior knowledge into neural machine translation. We represent prior knowledge sources as features in a log-linear model, which guides the learning process of the neural translation model. Experiments on a Chinese-English dataset show that our approach leads to significant improvements.
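As a rough illustration of the framework sketched in the abstract, the following is a minimal posterior-regularized training objective in the spirit of the standard posterior regularization framework: a log-linear "desired" distribution Q, whose features encode the prior knowledge sources, regularizes the neural translation model P during training. The weights and feature notation below are illustrative assumptions and may differ from the paper's own formulation.

% Schematic posterior-regularized objective (illustrative notation, not necessarily the paper's exact formulation).
% P(y|x; \theta): the neural translation model.
% Q(y|x; \gamma): a log-linear distribution whose features \phi(x, y) encode prior knowledge sources.
\begin{align}
J(\theta, \gamma) &= \lambda_1 \sum_{(x, y)} \log P(y \mid x; \theta)
  - \lambda_2 \sum_{(x, y)} \mathrm{KL}\big( Q(y \mid x; \gamma) \,\big\|\, P(y \mid x; \theta) \big), \\
Q(y \mid x; \gamma) &= \frac{\exp\big( \gamma \cdot \phi(x, y) \big)}{\sum_{y'} \exp\big( \gamma \cdot \phi(x, y') \big)}.
\end{align}

Maximizing the first term fits the parallel data as usual, while the KL term pulls the neural model's posterior toward the feature-based distribution, so that multiple overlapping knowledge sources can be combined simply as additional log-linear features.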
Anthology ID:
P17-1139
Volume:
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2017
Address:
Vancouver, Canada
Editors:
Regina Barzilay, Min-Yen Kan
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1514–1523
URL:
https://aclanthology.org/P17-1139
DOI:
10.18653/v1/P17-1139
Cite (ACL):
Jiacheng Zhang, Yang Liu, Huanbo Luan, Jingfang Xu, and Maosong Sun. 2017. Prior Knowledge Integration for Neural Machine Translation using Posterior Regularization. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1514–1523, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
Prior Knowledge Integration for Neural Machine Translation using Posterior Regularization (Zhang et al., ACL 2017)
PDF:
https://aclanthology.org/P17-1139.pdf
Code:
Glaceon31/PR4NMT