Quantifying Information of Tokens for Simple and Flexible Simultaneous Machine Translation

DongHyun Lee, Minkyung Park, Byung-Jun Lee


Abstract
Simultaneous Translation (ST) translates from partial source inputs rather than the entire source input, which can degrade translation quality. Previous approaches to balancing translation quality and latency have demonstrated that it is more efficient and effective to leverage an offline model with a reasonable policy. However, an offline model also suffers from a distribution shift, since it is not trained on partial source inputs; this can be mitigated by training an additional module that indicates when to translate. In this paper, we propose an Information Quantifier (IQ) that models source and target information to determine whether the offline model has sufficient information for translation, trained with oracle action sequences generated from the offline model. By quantifying information, IQ helps formulate a suitable policy for Simultaneous Translation that generalizes better and allows us to naturally control the trade-off between quality and latency. Experiments on various language pairs show that our proposed model outperforms baselines.
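
The abstract describes the decoding loop only at a high level. Below is a minimal, hypothetical sketch of a threshold-based READ/WRITE policy driven by an information score, meant only to illustrate the idea of gating an offline model on quantified information; the names information_score, translate_next_token, and threshold are illustrative assumptions, not the paper's actual interface.

# Hypothetical sketch (not the authors' implementation): a simultaneous
# decoding loop that READs source tokens until an information estimate
# clears a threshold, then WRITEs one target token with an offline model.
def simultaneous_decode(source_stream, information_score, translate_next_token,
                        threshold=0.5):
    """Greedy simultaneous decoding with an information-gated policy.

    source_stream: iterator over incoming source tokens.
    information_score(src_prefix, tgt_prefix) -> float, an estimate of
        whether the offline model has enough information to emit the next
        target token (the role the abstract assigns to IQ).
    translate_next_token(src_prefix, tgt_prefix) -> str or None, one decoding
        step of the offline translation model (None signals end of sentence).
    threshold: knob for the quality-latency trade-off; higher values wait
        for more source context before each WRITE.
    """
    src_prefix, tgt_prefix = [], []
    source_exhausted = False

    while True:
        # READ: consume source tokens while the estimated information
        # is still below the threshold.
        while (not source_exhausted
               and information_score(src_prefix, tgt_prefix) < threshold):
            try:
                src_prefix.append(next(source_stream))
            except StopIteration:
                source_exhausted = True

        # WRITE: emit one target token with the offline model.
        token = translate_next_token(src_prefix, tgt_prefix)
        if token is None:
            break
        tgt_prefix.append(token)

    return tgt_prefix

Raising threshold delays each WRITE until more source context has arrived, mirroring the natural quality-latency control the abstract attributes to quantified information.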
Anthology ID:
2023.conll-1.14
Volume:
Proceedings of the 27th Conference on Computational Natural Language Learning (CoNLL)
Month:
December
Year:
2023
Address:
Singapore
Editors:
Jing Jiang, David Reitter, Shumin Deng
Venue:
CoNLL
Publisher:
Association for Computational Linguistics
Pages:
200–210
URL:
https://aclanthology.org/2023.conll-1.14
DOI:
10.18653/v1/2023.conll-1.14
Cite (ACL):
DongHyun Lee, Minkyung Park, and Byung-Jun Lee. 2023. Quantifying Information of Tokens for Simple and Flexible Simultaneous Machine Translation. In Proceedings of the 27th Conference on Computational Natural Language Learning (CoNLL), pages 200–210, Singapore. Association for Computational Linguistics.
Cite (Informal):
Quantifying Information of Tokens for Simple and Flexible Simultaneous Machine Translation (Lee et al., CoNLL 2023)
PDF:
https://aclanthology.org/2023.conll-1.14.pdf
Software:
2023.conll-1.14.Software.zip
Video:
https://aclanthology.org/2023.conll-1.14.mp4