Exploring Multilingual Pretrained Machine Translation Models for Interactive Translation

Angel Navarro, Francisco Casacuberta


Abstract
Pre-trained large language models (LLMs) are important tools in many artificial intelligence applications. In this work, we explore the use of these models in interactive machine translation environments. In particular, we have chosen mBART (multilingual Bidirectional and Auto-Regressive Transformer) as one of these LLMs. The system enables users to refine the translation output interactively by providing feedback. It follows a two-step process: the NMT (Neural Machine Translation) model generates a preliminary translation in the first step, and the user makes one correction in the second step, repeating the process until the sentence is correctly translated. We assessed the performance of both mBART and its fine-tuned version by comparing them to a state-of-the-art machine translation model on a benchmark dataset, measuring user effort with the WSR (Word Stroke Ratio) and the MAR (Mouse Action Ratio). The experimental results indicate that all the models performed comparably, suggesting that mBART is a viable option for an interactive machine translation environment, as it eliminates the need to train a model from scratch for this particular task. This finding has implications for the development of new machine translation models for interactive environments, as it shows that recent pre-trained models achieve state-of-the-art performance in this domain and highlights the potential benefits of adapting them to specific needs.
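
The two-step loop described in the abstract can be made concrete with a small word-level simulation. The sketch below is illustrative only, not the authors' implementation: translate is a hypothetical hook for any prefix-constrained NMT decoder (e.g., mBART), and the simulated user corrects the first wrong word of each hypothesis, which is how the WSR (corrections per reference word) is commonly accumulated.

```python
# Minimal sketch of the prefix-based interactive translation protocol.
# Assumptions: `translate` is a hypothetical stand-in for a prefix-constrained
# NMT system (e.g., mBART); the simulated user corrects one word per iteration,
# which is what the Word Stroke Ratio (WSR) counts.

def translate(source: str, prefix: list[str]) -> list[str]:
    """Return a full target hypothesis (word list) that starts with `prefix`.
    Placeholder: plug a prefix-constrained mBART/NMT decoder in here."""
    raise NotImplementedError

def interactive_session(source: str, reference: list[str]) -> float:
    """Simulate the two-step protocol and return the session's WSR."""
    prefix: list[str] = []
    strokes = 0
    while True:
        hypothesis = translate(source, prefix)   # step 1: system proposes
        if hypothesis == reference:
            return strokes / len(reference)      # translation accepted
        # Step 2: the user locates the first word after the validated prefix
        # where the hypothesis diverges from the desired translation.
        i = len(prefix)
        while (i < len(reference) and i < len(hypothesis)
               and hypothesis[i] == reference[i]):
            i += 1
        if i >= len(reference):
            # Hypothesis over-generates: the user truncates it (one action).
            return (strokes + 1) / len(reference)
        # The user types the correct word (one word stroke); the validated
        # prefix plus this correction is fed back to the system.
        prefix = reference[: i + 1]
        strokes += 1
```

The MAR would be accumulated analogously from the simulated pointer actions (e.g., repositioning the cursor before each correction); for both ratios, lower values mean less human effort.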
Anthology ID:
2023.mtsummit-users.12
Volume:
Proceedings of Machine Translation Summit XIX, Vol. 2: Users Track
Month:
September
Year:
2023
Address:
Macau SAR, China
Editors:
Masaru Yamada, Felix do Carmo
Venue:
MTSummit
Publisher:
Asia-Pacific Association for Machine Translation
Pages:
132–142
URL:
https://aclanthology.org/2023.mtsummit-users.12
Cite (ACL):
Angel Navarro and Francisco Casacuberta. 2023. Exploring Multilingual Pretrained Machine Translation Models for Interactive Translation. In Proceedings of Machine Translation Summit XIX, Vol. 2: Users Track, pages 132–142, Macau SAR, China. Asia-Pacific Association for Machine Translation.
Cite (Informal):
Exploring Multilingual Pretrained Machine Translation Models for Interactive Translation (Navarro & Casacuberta, MTSummit 2023)
PDF:
https://aclanthology.org/2023.mtsummit-users.12.pdf