Language Model Based Target Token Importance Rescaling for Simultaneous Neural Machine Translation

Aditi Jain, Nishant Kambhatla, Anoop Sarkar


Abstract
The decoder in simultaneous neural machine translation receives limited information from the source while having to balance the opposing requirements of latency and translation quality. In this paper, we use an auxiliary target-side language model to augment the training of the decoder model. Under this notion of target adaptive training, generating rare or difficult tokens is rewarded, which improves translation quality while reducing latency. The predictions made by a language model in the decoder are combined with the traditional cross-entropy loss, which reduces the decoder's reliance on the limited source-side context. Our experimental results over multiple language pairs show that, compared to previous state-of-the-art methods in simultaneous translation, an augmented target-side context can improve BLEU scores significantly. We show improvements over the state of the art in the low-latency range, with lower Average Lagging values (faster output).
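The abstract describes reweighting the per-token cross-entropy loss with an auxiliary target-side language model so that rare or difficult target tokens carry more weight during training. The exact rescaling function is not given here, so the following is a minimal PyTorch sketch of one plausible scheme, assuming a frozen target-side LM scores each gold token and tokens with low LM probability are up-weighted; the function name lm_rescaled_cross_entropy, the weight (1 - p)^alpha, and the alpha parameter are illustrative assumptions, not the paper's formulation.

    import torch
    import torch.nn.functional as F

    def lm_rescaled_cross_entropy(decoder_logits, lm_logits, targets, pad_id=0, alpha=1.0):
        """Token-level cross-entropy reweighted by an auxiliary target-side LM.

        decoder_logits: (batch, tgt_len, vocab) from the simultaneous MT decoder
        lm_logits:      (batch, tgt_len, vocab) from a frozen target-side LM
        targets:        (batch, tgt_len) gold target token ids
        Tokens the LM finds unlikely (rare/difficult) receive larger loss weights.
        """
        vocab = decoder_logits.size(-1)
        # Per-token cross-entropy, no reduction, padding ignored.
        ce = F.cross_entropy(
            decoder_logits.view(-1, vocab), targets.view(-1),
            ignore_index=pad_id, reduction="none",
        )

        with torch.no_grad():
            # Probability the frozen LM assigns to each gold target token.
            lm_probs = F.softmax(lm_logits, dim=-1)
            p_tok = lm_probs.gather(-1, targets.unsqueeze(-1)).squeeze(-1)
            # Up-weight tokens the LM considers unlikely; alpha controls sharpness.
            weights = (1.0 - p_tok).pow(alpha).view(-1)

        mask = (targets.view(-1) != pad_id).float()
        return (weights * ce * mask).sum() / mask.sum()

    # Example usage with random tensors (batch=2, tgt_len=5, vocab=100):
    dec = torch.randn(2, 5, 100, requires_grad=True)
    lm = torch.randn(2, 5, 100)
    tgt = torch.randint(1, 100, (2, 5))
    loss = lm_rescaled_cross_entropy(dec, lm, tgt)
    loss.backward()

The weighting term here plays the same role as the importance rescaling described in the abstract: it shifts training focus toward tokens that a target-side LM alone would struggle to predict, rather than treating all tokens uniformly.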
Anthology ID: 2023.iwslt-1.32
Volume: Proceedings of the 20th International Conference on Spoken Language Translation (IWSLT 2023)
Month: July
Year: 2023
Address: Toronto, Canada (in-person and online)
Editors: Elizabeth Salesky, Marcello Federico, Marine Carpuat
Venue: IWSLT
SIG: SIGSLT
Publisher: Association for Computational Linguistics
Pages: 341–356
URL: https://aclanthology.org/2023.iwslt-1.32
DOI: 10.18653/v1/2023.iwslt-1.32
Cite (ACL): Aditi Jain, Nishant Kambhatla, and Anoop Sarkar. 2023. Language Model Based Target Token Importance Rescaling for Simultaneous Neural Machine Translation. In Proceedings of the 20th International Conference on Spoken Language Translation (IWSLT 2023), pages 341–356, Toronto, Canada (in-person and online). Association for Computational Linguistics.
Cite (Informal): Language Model Based Target Token Importance Rescaling for Simultaneous Neural Machine Translation (Jain et al., IWSLT 2023)
PDF: https://aclanthology.org/2023.iwslt-1.32.pdf