Neural Machine Translation with Contrastive Translation Memories

Xin Cheng, Shen Gao, Lemao Liu, Dongyan Zhao, Rui Yan


Abstract
Retrieval-augmented Neural Machine Translation models have been successful in many translation scenarios. In contrast to previous work that uses mutually similar but redundant translation memories (TMs), we propose a new retrieval-augmented NMT framework that models contrastively retrieved translation memories: memories that are holistically similar to the source sentence while individually contrastive to each other, providing maximal information gain. The framework operates in three phases. First, in the TM retrieval phase, we adopt a contrastive retrieval algorithm to avoid the redundancy and uninformativeness of similar translation pieces. Second, in the memory encoding phase, given a set of TMs we propose a novel Hierarchical Group Attention module to gather both the local context of each TM and the global context of the whole TM set. Third, in the training phase, a multi-TM contrastive learning objective is introduced to learn the salient features of each TM with respect to the target sentence. Experimental results show that our framework obtains substantial improvements over strong baselines on the benchmark dataset.
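The contrastive retrieval phase described above can be illustrated with a greedy, maximal-marginal-relevance-style selection: each new TM is scored by its similarity to the source minus its redundancy with already-selected TMs. This is a hypothetical simplification for intuition only; the function names, the `sim` scorer, and the `lam` trade-off parameter are assumptions, not the paper's actual algorithm.

```python
# Hypothetical sketch of contrastive TM retrieval: greedily pick memories
# that are similar to the source but dissimilar to those already selected
# (an MMR-style heuristic; the paper's exact scoring may differ).

def contrastive_retrieve(source, candidates, sim, k=3, lam=0.5):
    """Select up to k translation memories balancing relevance and diversity.

    sim(a, b) -> float: similarity between two sentences (assumed given).
    lam: trade-off between source relevance and inter-TM redundancy.
    """
    selected = []
    pool = list(candidates)
    while pool and len(selected) < k:
        def score(c):
            relevance = sim(source, c)
            # Penalize overlap with the most similar already-chosen TM.
            redundancy = max((sim(c, s) for s in selected), default=0.0)
            return lam * relevance - (1 - lam) * redundancy
        best = max(pool, key=score)
        selected.append(best)
        pool.remove(best)
    return selected
```

With any sentence-level similarity (e.g. token overlap or embedding cosine), the redundancy penalty steers the selection away from near-duplicate TMs, matching the paper's motivation of avoiding mutually similar but redundant memories.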
Anthology ID:
2022.emnlp-main.235
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3591–3601
URL:
https://aclanthology.org/2022.emnlp-main.235
DOI:
10.18653/v1/2022.emnlp-main.235
Cite (ACL):
Xin Cheng, Shen Gao, Lemao Liu, Dongyan Zhao, and Rui Yan. 2022. Neural Machine Translation with Contrastive Translation Memories. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 3591–3601, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Neural Machine Translation with Contrastive Translation Memories (Cheng et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.235.pdf