Simultaneous Machine Translation with Large Language Models

Minghan Wang, Thuy-Trang Vu, Jinming Zhao, Fatemeh Shiri, Ehsan Shareghi, Gholamreza Haffari


Abstract
Real-world simultaneous machine translation (SimulMT) systems face more challenges than just the quality-latency trade-off. They also need to address issues related to robustness with noisy input, processing long contexts, and flexibility for knowledge injection. These challenges demand models with strong language understanding and generation capabilities, which dedicated MT models are often not equipped with. In this paper, we investigate the possibility of applying Large Language Models (LLMs) to SimulMT tasks by using existing incremental-decoding methods with a newly proposed RALCP algorithm for latency reduction. We conducted experiments using the Llama2-7b-chat model on nine different languages from the MuST-C dataset. The results show that the LLM outperforms dedicated MT models in terms of BLEU and LAAL metrics. Further analysis indicates that the LLM has advantages in tuning efficiency and robustness. However, it is important to note that the computational cost of the LLM remains a significant obstacle to its application in SimulMT.
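The abstract does not spell out how RALCP reduces latency, so the following is only a rough sketch under an assumption: that RALCP commits the longest prefix on which a configurable fraction of beam candidates agree during incremental decoding, rather than requiring full agreement. The function name `agreement_prefix` and the `threshold` parameter are hypothetical illustrations, not the paper's actual interface.

```python
from collections import Counter

def agreement_prefix(candidates, threshold=0.6):
    """Return the longest prefix on which at least `threshold` of the beam
    candidates agree, voting position by position (hypothetical policy).

    candidates: list of token sequences (lists of token ids), one per beam.
    """
    if not candidates:
        return []
    prefix = []
    min_votes = threshold * len(candidates)
    shortest = min(len(c) for c in candidates)
    for position in range(shortest):
        votes = Counter(c[position] for c in candidates)
        token, count = votes.most_common(1)[0]
        if count >= min_votes:
            prefix.append(token)  # enough candidates agree: commit this token
        else:
            break  # agreement breaks down: wait for more source input
    return prefix

# Toy usage: three beam hypotheses; the first two positions are unanimous,
# the third still clears the 60% vote, so three tokens are committed.
beams = [[12, 7, 99, 4], [12, 7, 5], [12, 7, 99, 8]]
print(agreement_prefix(beams, threshold=0.6))  # -> [12, 7, 99]
```

Relaxing the agreement criterion in this way would let the decoder emit stable tokens earlier than a strict longest-common-prefix rule, trading a small quality risk for lower latency; the exact policy and hyperparameters used in the paper should be taken from the PDF linked below.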
Anthology ID:
2024.alta-1.7
Volume:
Proceedings of the 22nd Annual Workshop of the Australasian Language Technology Association
Month:
December
Year:
2024
Address:
Canberra, Australia
Editors:
Tim Baldwin, Sergio José Rodríguez Méndez, Nicholas Kuo
Venue:
ALTA
Publisher:
Association for Computational Linguistics
Pages:
89–103
URL:
https://aclanthology.org/2024.alta-1.7/
Cite (ACL):
Minghan Wang, Thuy-Trang Vu, Jinming Zhao, Fatemeh Shiri, Ehsan Shareghi, and Gholamreza Haffari. 2024. Simultaneous Machine Translation with Large Language Models. In Proceedings of the 22nd Annual Workshop of the Australasian Language Technology Association, pages 89–103, Canberra, Australia. Association for Computational Linguistics.
Cite (Informal):
Simultaneous Machine Translation with Large Language Models (Wang et al., ALTA 2024)
PDF:
https://aclanthology.org/2024.alta-1.7.pdf