Universal Simultaneous Machine Translation with Mixture-of-Experts Wait-k Policy

Shaolei Zhang, Yang Feng


Abstract
Simultaneous machine translation (SiMT) generates the translation before reading the entire source sentence, and hence has to trade off between translation quality and latency. To meet the varied quality and latency requirements of practical applications, previous methods usually need to train multiple SiMT models for different latency levels, resulting in large computational costs. In this paper, we propose a universal SiMT model with a Mixture-of-Experts Wait-k Policy that achieves the best translation quality under arbitrary latency with only one trained model. Specifically, our method employs multi-head attention to accomplish the mixture of experts, where each head is treated as a wait-k expert with its own number of waiting source words; given a test latency and the source inputs, the weights of the experts are adjusted accordingly to produce the best translation. Experiments on three datasets show that our method outperforms all the strong baselines under different latency levels, including the state-of-the-art adaptive policy.
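The two ingredients of the abstract can be sketched in a few lines: the standard wait-k schedule, under which the t-th target token may attend to g(t) = min(k + t - 1, |source|) source tokens, and a softmax weighting over per-head expert scores. This is a minimal illustrative sketch, not the paper's implementation; the function names and the idea of scoring experts from a scalar test latency are assumptions for illustration.

```python
import math

def waitk_schedule(k, src_len, tgt_len):
    """Number of source tokens available before emitting each target token
    under a wait-k policy: g(t) = min(k + t - 1, src_len)."""
    return [min(k + t - 1, src_len) for t in range(1, tgt_len + 1)]

def expert_weights(scores):
    """Softmax over per-head expert scores (hypothetically predicted from
    the test latency and source inputs), yielding the mixture weights."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

# Example: with k=3, a 5-token source, and a 4-token target,
# the schedule reads 3 tokens before the first write, then one more per step.
print(waitk_schedule(3, 5, 4))
```

For a low test latency, one would expect the predicted scores to favor small-k heads, shifting the mixture toward experts that start translating earlier.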
Anthology ID:
2021.emnlp-main.581
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
7306–7317
URL:
https://aclanthology.org/2021.emnlp-main.581
DOI:
10.18653/v1/2021.emnlp-main.581
Cite (ACL):
Shaolei Zhang and Yang Feng. 2021. Universal Simultaneous Machine Translation with Mixture-of-Experts Wait-k Policy. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 7306–7317, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Universal Simultaneous Machine Translation with Mixture-of-Experts Wait-k Policy (Zhang & Feng, EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.581.pdf
Video:
https://aclanthology.org/2021.emnlp-main.581.mp4
Code:
ictnlp/moe-waitk