Integrating Vectorized Lexical Constraints for Neural Machine Translation

Shuo Wang, Zhixing Tan, Yang Liu


Abstract
Lexically constrained neural machine translation (NMT), which controls the generation of NMT models with pre-specified constraints, is important in many practical scenarios. Due to the representation gap between discrete constraints and continuous vectors in NMT models, most existing works choose to construct synthetic data or modify the decoding algorithm to impose lexical constraints, treating the NMT model as a black box. In this work, we propose to open this black box by directly integrating the constraints into NMT models. Specifically, we vectorize source and target constraints into continuous keys and values, which can be utilized by the attention modules of NMT models. The proposed integration method is based on the assumption that the correspondence between keys and values in attention modules is naturally suitable for modeling constraint pairs. Experimental results show that our method consistently outperforms several representative baselines on four language pairs, demonstrating the superiority of integrating vectorized lexical constraints.
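The paper's full formulation is in the PDF; purely as an illustration of the key-value idea described in the abstract, below is a minimal single-head cross-attention sketch in PyTorch. The names `cons_keys` and `cons_values` are hypothetical: they stand for the vectorized source and target sides of the constraint pairs, which are simply concatenated to the ordinary encoder keys and values so that attending to a constraint's source-side key retrieves its paired target-side value. This is a sketch of the general mechanism, not the authors' exact architecture.

```python
import torch
import torch.nn.functional as F

def attention_with_constraints(query, keys, values, cons_keys, cons_values):
    """Single-head attention extended with vectorized constraint pairs.

    query:       (batch, tgt_len, d)  decoder states
    keys/values: (batch, src_len, d)  regular encoder keys and values
    cons_keys:   (batch, n_cons, d)   vectorized source sides of constraints
    cons_values: (batch, n_cons, d)   vectorized target sides of constraints
    """
    # Append the constraint vectors to the ordinary keys and values,
    # so the key-value correspondence models the constraint pairs.
    k = torch.cat([keys, cons_keys], dim=1)    # (batch, src_len + n_cons, d)
    v = torch.cat([values, cons_values], dim=1)

    d = query.size(-1)
    scores = torch.matmul(query, k.transpose(-2, -1)) / d ** 0.5
    weights = F.softmax(scores, dim=-1)
    return torch.matmul(weights, v)            # (batch, tgt_len, d)

# Toy usage: 2 sentences, 5 target positions, 7 source positions,
# 3 constraint pairs, model dimension 64.
q = torch.randn(2, 5, 64)
ks, vs = torch.randn(2, 7, 64), torch.randn(2, 7, 64)
ck, cv = torch.randn(2, 3, 64), torch.randn(2, 3, 64)
out = attention_with_constraints(q, ks, vs, ck, cv)  # shape (2, 5, 64)
```

In this reading, the model stays a standard Transformer at decoding time; the constraints enter only through extra key-value slots, which is what lets the method avoid modified decoding algorithms or synthetic training data.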
Anthology ID:
2022.acl-long.487
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
7063–7073
URL:
https://aclanthology.org/2022.acl-long.487
DOI:
10.18653/v1/2022.acl-long.487
Cite (ACL):
Shuo Wang, Zhixing Tan, and Yang Liu. 2022. Integrating Vectorized Lexical Constraints for Neural Machine Translation. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 7063–7073, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Integrating Vectorized Lexical Constraints for Neural Machine Translation (Wang et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.487.pdf
Code:
shuo-git/vecconstnmt