Encoding Sentence Position in Context-Aware Neural Machine Translation with Concatenation

Lorenzo Lupo, Marco Dinarelli, Laurent Besacier


Abstract
Context-aware translation can be achieved by processing a concatenation of consecutive sentences with the standard Transformer architecture. This paper investigates the intuitive idea of providing the model with explicit information about the position of the sentences contained in the concatenation window. We compare various methods to encode sentence positions into token representations, including novel ones. Our results show that the Transformer benefits from certain sentence position encoding methods on English to Russian translation, if trained with a context-discounted loss. However, the same benefits are not observed on English to German. Further empirical efforts are necessary to define the conditions under which the proposed approach is beneficial.
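The two ingredients named in the abstract can be sketched in a few lines: a learned embedding of each token's sentence index within the concatenation window (added to the usual token embedding), and a loss that down-weights context-sentence tokens relative to the current sentence. This is a minimal NumPy illustration, not the paper's exact method; all names, sizes, and the discount value are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model = 8      # embedding size (toy value)
vocab_size = 100
window = 3       # number of sentences in the concatenation window (hypothetical)

tok_emb = rng.normal(size=(vocab_size, d_model))
sent_emb = rng.normal(size=(window, d_model))  # one learned vector per sentence slot

def encode(token_ids, sent_ids):
    # Token representation = token embedding + embedding of the index of the
    # sentence the token belongs to within the concatenation window.
    return tok_emb[np.asarray(token_ids)] + sent_emb[np.asarray(sent_ids)]

def context_discounted_loss(token_losses, sent_ids, current=window - 1, discount=0.5):
    # Down-weight per-token losses on context sentences relative to the
    # current (last) sentence; the discount factor 0.5 is an arbitrary choice.
    w = np.where(np.asarray(sent_ids) == current, 1.0, discount)
    return float(np.sum(w * np.asarray(token_losses)) / np.sum(w))

# Same token id appearing in different sentence slots of the window:
x = encode([5, 5, 5], [0, 0, 2])
```

Because the sentence-slot embedding is added to the token embedding, the same token id gets a different representation depending on which sentence of the window it occurs in, which is exactly the disambiguation the abstract's "explicit information about the position of the sentences" refers to.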
Anthology ID:
2023.insights-1.4
Volume:
The Fourth Workshop on Insights from Negative Results in NLP
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Shabnam Tafreshi, Arjun Akula, João Sedoc, Aleksandr Drozd, Anna Rogers, Anna Rumshisky
Venue:
insights
Publisher:
Association for Computational Linguistics
Pages:
33–44
URL:
https://aclanthology.org/2023.insights-1.4
DOI:
10.18653/v1/2023.insights-1.4
Cite (ACL):
Lorenzo Lupo, Marco Dinarelli, and Laurent Besacier. 2023. Encoding Sentence Position in Context-Aware Neural Machine Translation with Concatenation. In The Fourth Workshop on Insights from Negative Results in NLP, pages 33–44, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Encoding Sentence Position in Context-Aware Neural Machine Translation with Concatenation (Lupo et al., insights 2023)
PDF:
https://aclanthology.org/2023.insights-1.4.pdf
Video:
https://aclanthology.org/2023.insights-1.4.mp4