Fast and Accurate Neural Machine Translation with Translation Memory

Qiuxiang He, Guoping Huang, Qu Cui, Li Li, Lemao Liu


Abstract
It is generally believed that a translation memory (TM) should benefit machine translation. Unfortunately, existing work demonstrates the superiority of TM-based neural machine translation (NMT) only on TM-specialized translation tasks rather than on general tasks, and it incurs non-negligible computational overhead. In this paper, we propose a fast and accurate approach to TM-based NMT within the Transformer framework: the model architecture is simple and employs a single bilingual sentence as its TM, leading to efficient training and inference, and its parameters are effectively optimized through a novel training criterion. Extensive experiments on six TM-specialized tasks show that the proposed approach substantially surpasses several strong baselines that use multiple TMs, in terms of both BLEU and running time. Notably, the proposed approach also advances the strong baselines on two general tasks (WMT news Zh→En and En→De).
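As a rough illustration of the single-TM setup the abstract describes, the sketch below retrieves the one most similar bilingual sentence pair from a TM and packs it together with the source sentence as model input. This is a minimal sketch, not the authors' method: the retrieval metric (difflib token-overlap similarity), the function names retrieve_tm and pack_input, and the <sep> separator token are all assumptions for illustration; the paper's exact retrieval and encoding scheme may differ.

```python
import difflib

def retrieve_tm(source, tm_corpus):
    """Return the single most similar (src, tgt) pair from the TM.

    Similarity here is difflib's ratio over whitespace tokens -- a
    stand-in for the fuzzy-match score commonly used in TM retrieval;
    the paper's actual retrieval metric is an assumption here.
    """
    def sim(a, b):
        return difflib.SequenceMatcher(None, a.split(), b.split()).ratio()
    return max(tm_corpus, key=lambda pair: sim(source, pair[0]))

def pack_input(source, tm_pair, sep="<sep>"):
    """Concatenate the source with one retrieved TM pair as model input.

    Using a single TM sentence (rather than many) keeps the encoder
    input short, which is where the speed advantage claimed in the
    abstract would come from. The separator token is a placeholder.
    """
    tm_src, tm_tgt = tm_pair
    return f"{source} {sep} {tm_src} {sep} {tm_tgt}"

# Hypothetical usage with a toy English-French TM:
tm_corpus = [
    ("the cat sat on the mat", "le chat s'est assis sur le tapis"),
    ("the dog ran in the park", "le chien a couru dans le parc"),
]
pair = retrieve_tm("the cat sat on a mat", tm_corpus)
print(pack_input("the cat sat on a mat", pair))
```

The design point this sketch mirrors is the trade-off named in the abstract: multi-TM baselines must encode several retrieved pairs per source sentence, while a single-TM model processes one fixed-size augmented input, reducing both training and inference cost.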
Anthology ID:
2021.acl-long.246
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
3170–3180
URL:
https://aclanthology.org/2021.acl-long.246
DOI:
10.18653/v1/2021.acl-long.246
Cite (ACL):
Qiuxiang He, Guoping Huang, Qu Cui, Li Li, and Lemao Liu. 2021. Fast and Accurate Neural Machine Translation with Translation Memory. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 3170–3180, Online. Association for Computational Linguistics.
Cite (Informal):
Fast and Accurate Neural Machine Translation with Translation Memory (He et al., ACL 2021)
PDF:
https://aclanthology.org/2021.acl-long.246.pdf