How Far can 100 Samples Go? Unlocking Zero-Shot Translation with Tiny Multi-Parallel Data

Di Wu, Shaomu Tan, Yan Meng, David Stap, Christof Monz


Abstract
Zero-shot translation aims to translate between language pairs not seen during training in Multilingual Machine Translation (MMT) and is widely considered an open problem. A common, albeit resource-consuming, solution is to add as many related translation directions as possible to the training corpus. In this paper, we show that for an English-centric model, surprisingly large zero-shot improvements can be achieved by simply fine-tuning with a very small amount of multi-parallel data. For example, on the EC30 dataset, we obtain an overall improvement of up to +21.7 ChrF++ across non-English directions (870 directions) by using only 100 multi-parallel samples, while preserving English-centric translation quality. This performance exceeds M2M100 by an average of 5.9 ChrF++ on the involved non-English directions. When investigating how the size of the fine-tuning data affects translation quality, we find that a small, randomly sampled set of fine-tuning directions is already sufficient to achieve comparable improvements. The resulting non-English performance is close to the complete translation upper bound. Even in a minimal setting, fine-tuning with only a single sample, the well-known off-target issue is almost completely resolved, which explains part, but not all, of the observed improvements in translation quality.
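The recipe the abstract describes is simple at its core: take an English-centric many-to-many model and fine-tune it on a handful of multi-parallel sentences that cover non-English directions. The sketch below is a minimal illustration of that idea, not the authors' released code; the choice of M2M100 as the base model (in the paper it is a baseline, not the fine-tuned system), the toy sentences, the learning rate, and the single-pass loop are all assumptions for demonstration.

```python
# Minimal sketch (NOT the authors' code): fine-tune a multilingual MT model
# on a tiny multi-parallel set to unlock zero-shot non-English directions.
# Base model, data, and hyperparameters below are illustrative assumptions.
import torch
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

model_name = "facebook/m2m100_418M"  # assumed base model for illustration
tokenizer = M2M100Tokenizer.from_pretrained(model_name)
model = M2M100ForConditionalGeneration.from_pretrained(model_name)

# Multi-parallel data: the SAME sentences aligned across several languages.
# The paper uses ~100 such samples; two toy sentences shown here.
multi_parallel = {
    "de": ["Das Haus ist klein.", "Ich trinke Wasser."],
    "fr": ["La maison est petite.", "Je bois de l'eau."],
    "nl": ["Het huis is klein.", "Ik drink water."],
}

# Expand the multi-parallel set into every ordered non-English pair,
# e.g. de->fr, fr->de, de->nl, ... (the zero-shot directions of interest).
pairs = []
langs = list(multi_parallel)
for src in langs:
    for tgt in langs:
        if src == tgt:
            continue
        for s, t in zip(multi_parallel[src], multi_parallel[tgt]):
            pairs.append((src, s, tgt, t))

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)  # assumed LR
model.train()
for src_lang, src_text, tgt_lang, tgt_text in pairs:
    # Language tags steer M2M100; labels are tokenized under tgt_lang.
    tokenizer.src_lang = src_lang
    tokenizer.tgt_lang = tgt_lang
    batch = tokenizer(src_text, text_target=tgt_text, return_tensors="pt")
    loss = model(**batch).loss  # standard cross-entropy over the target
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

Per the abstract, even this scale of supervision largely resolves the off-target issue (the model generating in the wrong language on unseen pairs), which accounts for part, though not all, of the reported zero-shot gains.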
Anthology ID:
2024.findings-acl.896
Volume:
Findings of the Association for Computational Linguistics ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand and virtual meeting
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
15092–15108
URL:
https://aclanthology.org/2024.findings-acl.896
Cite (ACL):
Di Wu, Shaomu Tan, Yan Meng, David Stap, and Christof Monz. 2024. How Far can 100 Samples Go? Unlocking Zero-Shot Translation with Tiny Multi-Parallel Data. In Findings of the Association for Computational Linguistics ACL 2024, pages 15092–15108, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
Cite (Informal):
How Far can 100 Samples Go? Unlocking Zero-Shot Translation with Tiny Multi-Parallel Data (Wu et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.896.pdf