Wait-info Policy: Balancing Source and Target at Information Level for Simultaneous Machine Translation

Shaolei Zhang, Shoutao Guo, Yang Feng


Abstract
Simultaneous machine translation (SiMT) outputs the translation while still receiving the source inputs, and hence needs to balance the received source information against the translated target information to make a reasonable decision between waiting for more inputs and outputting translation. Previous methods always balance source and target information at the token level, either directly waiting for a fixed number of tokens or adjusting the waiting based on the current token. In this paper, we propose a Wait-info Policy to balance source and target at the information level. We first quantify the amount of information contained in each token, named info. Then, during simultaneous translation, the decision to wait or output is made by comparing the total info of the previous target outputs with that of the received source inputs. Experiments show that our method outperforms strong baselines and achieves a better balance via the proposed info.
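The decision rule described in the abstract can be sketched in a few lines: the policy reads another source token until the accumulated info of the received source tokens exceeds the accumulated info of the target tokens already output. The per-token info values below are illustrative placeholders (the paper quantifies them from the model); the function name `wait_info_decision` is our own.

```python
def wait_info_decision(src_info, tgt_info, received, generated):
    """Sketch of the Wait-info decision rule.

    src_info:  per-token info amounts for the source tokens (hypothetical values)
    tgt_info:  per-token info amounts for the target tokens generated so far
    received:  number of source tokens received so far
    generated: number of target tokens already output

    Returns "WRITE" (output a translation token) when the total info of the
    received source inputs exceeds the total info of the previous target
    outputs, and "READ" (wait for more input) otherwise.
    """
    total_src = sum(src_info[:received])
    total_tgt = sum(tgt_info[:generated])
    return "WRITE" if total_src > total_tgt else "READ"


# Example with made-up info values:
src = [1.2, 0.4, 1.0, 0.9]
tgt = [1.0, 0.8]
print(wait_info_decision(src, tgt, received=2, generated=2))  # → READ
print(wait_info_decision(src, tgt, received=4, generated=2))  # → WRITE
```

With two source tokens received (total info 1.6) against two target tokens output (total info 1.8), the policy waits; once all four source tokens arrive (total info 3.5), it outputs.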
Anthology ID:
2022.findings-emnlp.166
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2249–2263
URL:
https://aclanthology.org/2022.findings-emnlp.166
DOI:
10.18653/v1/2022.findings-emnlp.166
Cite (ACL):
Shaolei Zhang, Shoutao Guo, and Yang Feng. 2022. Wait-info Policy: Balancing Source and Target at Information Level for Simultaneous Machine Translation. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 2249–2263, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Wait-info Policy: Balancing Source and Target at Information Level for Simultaneous Machine Translation (Zhang et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.166.pdf
Video:
https://aclanthology.org/2022.findings-emnlp.166.mp4