Sequence Shortening for Context-Aware Machine Translation

Paweł Maka, Yusuf Semerci, Jan Scholtes, Gerasimos Spanakis


Abstract
Context-aware Machine Translation aims to improve the translation of sentences by incorporating surrounding sentences as context. Two main architectures have been applied to this task: single-encoder models (based on concatenation) and multi-encoder models. In this study, we show that a special case of the multi-encoder architecture, in which the latent representation of the source sentence is cached and reused as the context in the next step, achieves higher accuracy on the contrastive datasets (where the models have to rank the correct translation among the provided candidates) and BLEU and COMET scores comparable to the single- and multi-encoder approaches. Furthermore, we investigate applying Sequence Shortening to the cached representations. We test three pooling-based shortening techniques and introduce two novel methods, Latent Grouping and Latent Selecting, in which the network learns to group tokens or to select the tokens to be cached as context. Our experiments show that both methods achieve BLEU and COMET scores and contrastive-dataset accuracies competitive with the other tested methods, while potentially allowing for higher interpretability and reducing the growth of memory requirements as the context size increases.
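To illustrate the general idea of pooling-based shortening described above — compressing a cached sequence representation before reusing it as context — here is a minimal sketch. It is not the authors' implementation; the function name, the zero-padding choice, and the fixed shortening factor are assumptions made for the example.

```python
import numpy as np

def pool_shorten(hidden: np.ndarray, factor: int) -> np.ndarray:
    """Shorten a cached representation of shape (T, d) by average-pooling
    non-overlapping windows of `factor` tokens along the time axis.

    Zero-pads the sequence so its length is divisible by `factor`
    (a simplifying assumption for this sketch)."""
    T, d = hidden.shape
    pad = (-T) % factor
    if pad:
        hidden = np.concatenate([hidden, np.zeros((pad, d))], axis=0)
    # Group tokens into windows of size `factor` and average each window.
    return hidden.reshape(-1, factor, d).mean(axis=1)

# A 5-token sequence of 4-dim hidden states shortens to 3 cached vectors
# with factor 2, halving (up to rounding) what must be stored as context.
states = np.arange(20, dtype=float).reshape(5, 4)
short = pool_shorten(states, 2)
```

The learned variants in the paper (Latent Grouping and Latent Selecting) replace this fixed pooling with network-predicted token-to-group assignments or token selection, but the memory effect is the same: the cached context grows with the shortened length rather than the full sentence length.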
Anthology ID:
2024.findings-eacl.127
Volume:
Findings of the Association for Computational Linguistics: EACL 2024
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1874–1894
URL:
https://aclanthology.org/2024.findings-eacl.127
Cite (ACL):
Paweł Maka, Yusuf Semerci, Jan Scholtes, and Gerasimos Spanakis. 2024. Sequence Shortening for Context-Aware Machine Translation. In Findings of the Association for Computational Linguistics: EACL 2024, pages 1874–1894, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
Sequence Shortening for Context-Aware Machine Translation (Maka et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-eacl.127.pdf
Video:
https://aclanthology.org/2024.findings-eacl.127.mp4