DEEP: DEnoising Entity Pre-training for Neural Machine Translation

Junjie Hu, Hiroaki Hayashi, Kyunghyun Cho, Graham Neubig


Abstract
It has been shown that machine translation models usually generate poor translations for named entities that are infrequent in the training corpus. Earlier named entity translation methods focus mainly on phonetic transliteration, which ignores the sentence context and is limited in domain and language coverage. To address this limitation, we propose DEEP, a DEnoising Entity Pre-training method that leverages large amounts of monolingual data and a knowledge base to improve named entity translation accuracy within sentences. In addition, we investigate a multi-task learning strategy that finetunes a pre-trained neural machine translation model on both entity-augmented monolingual data and parallel data to further improve entity translation. Experimental results on three language pairs demonstrate that DEEP yields significant improvements over strong denoising auto-encoding baselines, with gains of up to 1.3 BLEU and up to 9.2 entity accuracy points for English–Russian translation.
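As a rough illustration of the entity-based denoising idea summarized in the abstract, the sketch below builds (noised, original) sentence pairs by replacing named-entity spans in monolingual text with a mask token, so that a sequence-to-sequence model can be pre-trained to regenerate the entities from sentence context. The span-based entity input, the `<ent>` mask token, and the pair format are illustrative assumptions for this sketch, not the paper's exact procedure.

```python
# Illustrative sketch of entity-based denoising pre-training data creation.
# Assumptions (not from the paper): entity mentions are given as character
# spans, e.g. from a gazetteer or knowledge-base linker, and a single <ent>
# mask token replaces each entity on the noised (source) side.

from typing import List, Tuple

MASK = "<ent>"  # hypothetical mask token


def make_denoising_pair(sentence: str,
                        entity_spans: List[Tuple[int, int]]) -> Tuple[str, str]:
    """Return (noised_source, original_target) for denoising pre-training.

    Each (start, end) character span marking a named entity is replaced by
    MASK in the source; the target is the untouched sentence, so the model
    learns to recover entities from the surrounding context.
    """
    noised = []
    prev = 0
    for start, end in sorted(entity_spans):
        noised.append(sentence[prev:start])
        noised.append(MASK)
        prev = end
    noised.append(sentence[prev:])
    return "".join(noised), sentence


if __name__ == "__main__":
    sent = "Barack Obama visited Dublin in May 2011."
    spans = [(0, 12), (21, 27)]  # "Barack Obama", "Dublin"
    src, tgt = make_denoising_pair(sent, spans)
    print(src)  # <ent> visited <ent> in May 2011.
    print(tgt)  # Barack Obama visited Dublin in May 2011.
```

In this framing, pre-training on such pairs exposes the decoder to entity generation in context, after which the model can be finetuned on parallel data (or, as the abstract describes, jointly on entity-augmented monolingual data and parallel data in a multi-task setup).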
Anthology ID: 2022.acl-long.123
Volume: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: May
Year: 2022
Address: Dublin, Ireland
Editors: Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 1753–1766
URL: https://aclanthology.org/2022.acl-long.123
DOI: 10.18653/v1/2022.acl-long.123
Cite (ACL): Junjie Hu, Hiroaki Hayashi, Kyunghyun Cho, and Graham Neubig. 2022. DEEP: DEnoising Entity Pre-training for Neural Machine Translation. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1753–1766, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal): DEEP: DEnoising Entity Pre-training for Neural Machine Translation (Hu et al., ACL 2022)
PDF: https://aclanthology.org/2022.acl-long.123.pdf
Data: ParaCrawl