Context Consistency between Training and Inference in Simultaneous Machine Translation

Meizhi Zhong, Lemao Liu, Kehai Chen, Mingming Yang, Min Zhang


Abstract
Simultaneous Machine Translation (SiMT) aims to yield a real-time partial translation with a monotonically growing source-side context. However, there is a counterintuitive phenomenon about context usage between training and inference: *e.g.*, under wait-k inference, a model consistently trained with wait-k yields much worse translation quality than a model inconsistently trained with wait-k' (k' ≠ k). To this end, we first investigate the underlying reasons behind this phenomenon and uncover two factors: 1) the limited correlation between translation quality and training loss; 2) exposure bias between training and inference. Based on these reasons, we then propose an effective training approach called context consistency training, which encourages consistent context usage between training and inference by optimizing translation quality and latency as bi-objectives and exposing the model to its own predictions during training. Experiments on three language pairs demonstrate that our SiMT system encouraging context consistency outperforms, for the first time, existing SiMT systems relying on context inconsistency.
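To make the two ingredients in the abstract concrete, below is a minimal Python sketch (not the authors' code) of (1) the wait-k read/write schedule that determines how much source context is visible at each decoding step, and (2) a bi-objective that trades translation quality (cross-entropy) against latency. The function names, the choice of average lagging as the latency metric, and the linear weighting are illustrative assumptions, not the paper's exact formulation.

```python
from typing import List

def wait_k_schedule(src_len: int, tgt_len: int, k: int) -> List[int]:
    """Number of source tokens visible when emitting each target token.

    Under wait-k, the model first reads k source tokens, then alternates
    write/read, never exceeding the full source length.
    """
    return [min(k + t, src_len) for t in range(tgt_len)]

def average_lagging(schedule: List[int], src_len: int) -> float:
    """Average lagging (AL) latency for a fixed read/write schedule."""
    tgt_len = len(schedule)
    gamma = tgt_len / src_len  # target-to-source length ratio
    # Average the lag g(t) - (t-1)/gamma up to the first step that sees the full source.
    cutoff = next((t for t, g in enumerate(schedule) if g >= src_len), tgt_len - 1)
    lags = [schedule[t] - t / gamma for t in range(cutoff + 1)]
    return sum(lags) / len(lags)

def bi_objective_loss(nll: float, latency: float, lam: float = 0.1) -> float:
    """Combine translation quality (negative log-likelihood) and latency into one objective.

    `lam` (assumed here) controls the quality/latency trade-off.
    """
    return nll + lam * latency

if __name__ == "__main__":
    sched = wait_k_schedule(src_len=10, tgt_len=12, k=3)
    print("visible source tokens per step:", sched)
    print("average lagging:", round(average_lagging(sched, src_len=10), 3))
    print("bi-objective (toy nll=2.0):", bi_objective_loss(2.0, average_lagging(sched, 10)))
```

Training under the same schedule used at inference, and scoring candidates with a joint quality/latency objective, is the intuition behind context consistency training as summarized in the abstract.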
Anthology ID:
2024.acl-long.727
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
13465–13476
URL:
https://aclanthology.org/2024.acl-long.727
Cite (ACL):
Meizhi Zhong, Lemao Liu, Kehai Chen, Mingming Yang, and Min Zhang. 2024. Context Consistency between Training and Inference in Simultaneous Machine Translation. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 13465–13476, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Context Consistency between Training and Inference in Simultaneous Machine Translation (Zhong et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-long.727.pdf