Learn To Remember: Transformer with Recurrent Memory for Document-Level Machine Translation

Yukun Feng, Feng Li, Ziang Song, Boyuan Zheng, Philipp Koehn


Abstract
The Transformer architecture has led to significant gains in machine translation. However, most studies focus only on sentence-level translation and ignore context dependencies within documents, which leads to inadequate document-level coherence. Some recent work tries to mitigate this issue by introducing an additional context encoder or by translating multiple sentences, or even the entire document, at once. Such methods may lose target-side information or suffer growing computational complexity as documents get longer. To address these problems, we introduce a recurrent memory unit into the vanilla Transformer, which supports information exchange between the current sentence and the previous context. The memory unit is recurrently updated: it acquires information from each sentence and passes the aggregated knowledge back to subsequent sentence states. We follow a two-stage training strategy, in which the model is first trained at the sentence level and then fine-tuned for document-level translation. We conduct experiments on three popular datasets for document-level machine translation, and our model achieves an average improvement of 0.91 s-BLEU over the sentence-level baseline. We also achieve state-of-the-art results on TED and News, outperforming the previous work by 0.36 s-BLEU and 1.49 d-BLEU on average.
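A minimal sketch of the idea described in the abstract, assuming a PyTorch-style implementation: each sentence is encoded by a standard Transformer encoder, reads document context from a small set of memory slots via attention, and then writes its states back into the memory with a gated recurrent update before the next sentence is processed. All module and parameter names (MemoryUnit, num_slots, etc.) are illustrative assumptions, not taken from the paper or its released software.

```python
# Hedged sketch of a recurrent memory unit around a vanilla Transformer
# encoder; names and design details are assumptions, not the authors' code.
import torch
import torch.nn as nn


class MemoryUnit(nn.Module):
    def __init__(self, d_model=512, num_slots=8, num_heads=8):
        super().__init__()
        # Learned initial memory slots, shared across documents.
        self.init_memory = nn.Parameter(torch.randn(num_slots, d_model) * 0.02)
        # Sentence states attend to the memory to read accumulated context.
        self.read_attn = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
        # Memory slots attend to the sentence states to gather new information.
        self.write_attn = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
        # Gated recurrent update of each memory slot.
        self.update = nn.GRUCell(d_model, d_model)

    def initial_state(self, batch_size):
        return self.init_memory.unsqueeze(0).expand(batch_size, -1, -1).contiguous()

    def read(self, sent_states, memory):
        # Inject document context into the current sentence encoding.
        context, _ = self.read_attn(sent_states, memory, memory)
        return sent_states + context  # residual connection

    def write(self, sent_states, memory):
        # Aggregate the current sentence into the memory slots.
        gathered, _ = self.write_attn(memory, sent_states, sent_states)
        b, m, d = memory.shape
        new_memory = self.update(gathered.reshape(b * m, d), memory.reshape(b * m, d))
        return new_memory.view(b, m, d)


class DocEncoder(nn.Module):
    """Sentence-level Transformer encoder plus the recurrent memory."""

    def __init__(self, d_model=512, num_heads=8, num_layers=6):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.memory = MemoryUnit(d_model, num_heads=num_heads)

    def forward(self, doc_sentences):
        # doc_sentences: list of (batch, sent_len, d_model) embeddings,
        # one entry per sentence, in document order.
        mem = self.memory.initial_state(doc_sentences[0].size(0))
        outputs = []
        for sent in doc_sentences:
            states = self.encoder(sent)             # sentence-level encoding
            states = self.memory.read(states, mem)  # add document context
            mem = self.memory.write(states, mem)    # update memory for next sentence
            outputs.append(states)
        return outputs, mem


if __name__ == "__main__":
    doc = [torch.randn(2, 10, 512) for _ in range(3)]  # 3 sentences, batch of 2
    enc_out, final_mem = DocEncoder()(doc)
    print([o.shape for o in enc_out], final_mem.shape)
```

Under the paper's two-stage recipe, such an encoder would first be trained at the sentence level and then fine-tuned on full documents so the memory learns to carry context across sentences; this mapping onto the abstract is an assumption, not the authors' exact procedure.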
Anthology ID:
2022.findings-naacl.105
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1409–1420
URL:
https://aclanthology.org/2022.findings-naacl.105
DOI:
10.18653/v1/2022.findings-naacl.105
Cite (ACL):
Yukun Feng, Feng Li, Ziang Song, Boyuan Zheng, and Philipp Koehn. 2022. Learn To Remember: Transformer with Recurrent Memory for Document-Level Machine Translation. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 1409–1420, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Learn To Remember: Transformer with Recurrent Memory for Document-Level Machine Translation (Feng et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.105.pdf
Software:
2022.findings-naacl.105.software.zip
Data:
Europarl