Pivot Translation for Zero-resource Language Pairs Based on a Multilingual Pretrained Model

Kenji Imamura, Masao Utiyama, Eiichiro Sumita


Abstract
A multilingual translation model enables a single model to handle multiple languages. However, translation quality for unlearned language pairs (i.e., zero-shot translation quality) remains poor. By contrast, pivot translation translates source texts into the target language via a pivot language such as English, enabling machine translation without parallel texts between the source and target languages. In this paper, we perform pivot translation using a multilingual model and compare it with direct translation. We improve the quality of direct translation without using source–target parallel texts by fine-tuning the model on machine-translated pseudo-translations. We also discuss what types of parallel texts are suitable for effectively improving translation quality in multilingual pivot translation.
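The pivot scheme the abstract describes is simply the composition of two translation steps through an intermediate language. The following is a minimal sketch of that composition; `translate` is a hypothetical stand-in for a real MT call (here mocked with a tiny lookup table purely to show the two-step flow), not the paper's actual system.

```python
PIVOT = "en"  # English as the pivot language

# Toy translation table standing in for a real multilingual model.
_TOY_MT = {
    ("de", "en", "Guten Morgen"): "Good morning",
    ("en", "ja", "Good morning"): "おはよう",
}

def translate(text: str, src: str, tgt: str) -> str:
    """Mock MT call; a real system would run model inference here."""
    return _TOY_MT[(src, tgt, text)]

def pivot_translate(text: str, src: str, tgt: str, pivot: str = PIVOT) -> str:
    """Source -> pivot -> target, needing no direct src-tgt parallel data."""
    intermediate = translate(text, src, pivot)   # source -> pivot
    return translate(intermediate, pivot, tgt)   # pivot -> target

print(pivot_translate("Guten Morgen", src="de", tgt="ja"))  # おはよう
```

Because each step only needs parallel data with the pivot language, a zero-resource source–target pair becomes translatable, at the cost of compounding errors across the two steps.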
Anthology ID:
2023.mtsummit-research.29
Volume:
Proceedings of Machine Translation Summit XIX, Vol. 1: Research Track
Month:
September
Year:
2023
Address:
Macau SAR, China
Editors:
Masao Utiyama, Rui Wang
Venue:
MTSummit
Publisher:
Asia-Pacific Association for Machine Translation
Pages:
348–359
URL:
https://aclanthology.org/2023.mtsummit-research.29
Cite (ACL):
Kenji Imamura, Masao Utiyama, and Eiichiro Sumita. 2023. Pivot Translation for Zero-resource Language Pairs Based on a Multilingual Pretrained Model. In Proceedings of Machine Translation Summit XIX, Vol. 1: Research Track, pages 348–359, Macau SAR, China. Asia-Pacific Association for Machine Translation.
Cite (Informal):
Pivot Translation for Zero-resource Language Pairs Based on a Multilingual Pretrained Model (Imamura et al., MTSummit 2023)
PDF:
https://aclanthology.org/2023.mtsummit-research.29.pdf