Adapting to Non-Centered Languages for Zero-shot Multilingual Translation

Zhi Qu, Taro Watanabe


Abstract
Multilingual neural machine translation can translate unseen language pairs during training, i.e., zero-shot translation. However, zero-shot translation is often unstable. While prior works attributed this instability to the domination of the central language, e.g., English, we supplement this viewpoint with the strict dependence of non-centered languages. In this work, we propose a simple, lightweight yet effective language-specific modeling method that adapts to non-centered languages and combines shared information with language-specific information to counteract the instability of zero-shot translation. Experiments with the Transformer on the IWSLT17, Europarl, TED talks, and OPUS-100 datasets show that our method not only outperforms strong baselines in centered data conditions but also easily fits non-centered data conditions. By further investigating layer attribution, we show that the proposed method disentangles the coupled representation in the correct direction.
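The abstract's core idea of combining shared and language-specific information can be illustrated with a minimal sketch. This is not the authors' exact architecture (see the paper and the linked repository for that); the per-language projection, the fixed mixing gate, and all dimensions below are hypothetical, chosen only to show one simple way a language-specific adapter can be blended with a shared representation.

```python
import numpy as np

class LanguageAdapter:
    """Illustrative only: one small projection per non-centered language,
    mixed with a shared hidden state via a scalar gate."""

    def __init__(self, dim, languages, seed=0):
        rng = np.random.default_rng(seed)
        # Hypothetical language-specific weights (one matrix per language).
        self.w = {lang: rng.standard_normal((dim, dim)) / np.sqrt(dim)
                  for lang in languages}

    def __call__(self, shared, lang, gate=0.5):
        # Language-specific view of the shared representation.
        specific = shared @ self.w[lang]
        # Convex combination of shared and language-specific information.
        return (1.0 - gate) * shared + gate * specific

adapter = LanguageAdapter(dim=4, languages=["de", "fr", "zh"])
h = np.ones(4)            # a shared hidden state from the encoder
out = adapter(h, "de")    # mixed representation for German
print(out.shape)          # (4,)
```

With `gate=0.0` the adapter reduces to the shared model, so the language-specific path can be added without disturbing shared parameters; the actual method in the paper determines this combination in a learned, layer-wise fashion.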
Anthology ID:
2022.coling-1.467
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
5251–5265
URL:
https://aclanthology.org/2022.coling-1.467
Cite (ACL):
Zhi Qu and Taro Watanabe. 2022. Adapting to Non-Centered Languages for Zero-shot Multilingual Translation. In Proceedings of the 29th International Conference on Computational Linguistics, pages 5251–5265, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Adapting to Non-Centered Languages for Zero-shot Multilingual Translation (Qu & Watanabe, COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.467.pdf
Code
 zhiqu22/adapnoncenter
Data
OPUS-100