Learning Decoupled Retrieval Representation for Nearest Neighbour Neural Machine Translation

Qiang Wang, Rongxiang Weng, Ming Chen


Abstract
k-Nearest-Neighbor Neural Machine Translation (kNN-MT) successfully incorporates an external corpus by retrieving word-level representations at test time. Generally, kNN-MT borrows the off-the-shelf context representation from the translation task, e.g., the output of the last decoder layer, as the query vector for the retrieval task. In this work, we highlight that coupling the representations of these two tasks is sub-optimal for fine-grained retrieval. To alleviate this, we leverage supervised contrastive learning to learn a distinctive retrieval representation derived from the original context representation. We also propose a fast and effective approach to constructing hard negative samples. Experimental results on five domains show that our approach improves retrieval accuracy and BLEU scores compared to vanilla kNN-MT.
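To make the retrieval step the abstract refers to concrete, here is a minimal numpy sketch of vanilla kNN-MT decoding: the decoder's context vector is used directly as the query into a (key, target-token) datastore, the k nearest entries are softmax-weighted by distance, and the resulting distribution is interpolated with the NMT model's distribution. All function names, the toy datastore, and the hyperparameters (`k`, `temperature`, `lam`) are illustrative assumptions, not the paper's implementation; the paper's contribution would correspond to replacing the raw context vector with a learned, contrastively trained projection before querying.

```python
import numpy as np

def knn_probs(query, keys, values, vocab_size, k=2, temperature=10.0):
    """Vanilla kNN-MT retrieval (illustrative): find the k nearest datastore
    keys to the query and convert their distances into a distribution over
    target tokens. `values[i]` is the target-token id stored with `keys[i]`."""
    dists = np.sum((keys - query) ** 2, axis=1)   # squared L2 distances
    idx = np.argsort(dists)[:k]                   # indices of k nearest keys
    weights = np.exp(-dists[idx] / temperature)   # closer neighbours weigh more
    weights /= weights.sum()
    probs = np.zeros(vocab_size)
    for w, v in zip(weights, values[idx]):
        probs[v] += w                             # aggregate weight per token
    return probs

def interpolate(p_nmt, p_knn, lam=0.5):
    """Final kNN-MT distribution: a fixed mixture of the NMT model's
    distribution and the retrieval distribution."""
    return lam * p_knn + (1 - lam) * p_nmt
```

In the vanilla setup sketched here, `query` is simply the last decoder layer's output for the current step; the decoupling idea in the paper amounts to passing that vector through a separately supervised projection before it enters `knn_probs`.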
Anthology ID:
2022.coling-1.456
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
5142–5147
URL:
https://aclanthology.org/2022.coling-1.456
Cite (ACL):
Qiang Wang, Rongxiang Weng, and Ming Chen. 2022. Learning Decoupled Retrieval Representation for Nearest Neighbour Neural Machine Translation. In Proceedings of the 29th International Conference on Computational Linguistics, pages 5142–5147, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Learning Decoupled Retrieval Representation for Nearest Neighbour Neural Machine Translation (Wang et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.456.pdf