Informative Language Representation Learning for Massively Multilingual Neural Machine Translation

Renren Jin, Deyi Xiong


Abstract
In a multilingual neural machine translation model that fully shares parameters across all languages, an artificial language token is usually prepended to guide translation into the desired target language. However, recent studies show that prepending language tokens sometimes fails to steer multilingual neural machine translation models toward the right translation directions, especially in zero-shot translation. To mitigate this issue, we propose two methods, language embedding embodiment and language-aware multi-head attention, to learn informative language representations that channel translation into the right directions. The former embodies language embeddings at critical switching points along the information flow from the source to the target, aiming to amplify the signals that guide translation direction. The latter exploits a matrix, instead of a vector, to represent a language in the continuous space. The matrix is chunked into multiple heads so as to learn language representations in multiple subspaces. Experimental results on two datasets for massively multilingual neural machine translation demonstrate that language-aware multi-head attention benefits both supervised and zero-shot translation and significantly alleviates the off-target translation issue. Further linguistic typology prediction experiments show that the matrix-based language representations learned by our methods capture rich linguistic typology features.
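To make the second method concrete, below is a minimal PyTorch sketch of language-aware multi-head attention as described in the abstract: each language is represented by a learnable (num_heads × head_dim) matrix chunked across attention heads, rather than by a single embedding vector. The integration shown here, prepending the per-head language vectors as one extra key/value position, is an assumption for illustration, not the authors' implementation; the class and parameter names are likewise hypothetical.

```python
import torch
import torch.nn as nn


class LanguageAwareMultiHeadAttention(nn.Module):
    """Hypothetical sketch: a language is a (num_heads x head_dim) matrix,
    chunked across attention heads, instead of a single embedding vector."""

    def __init__(self, num_languages: int, d_model: int, num_heads: int):
        super().__init__()
        assert d_model % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = d_model // num_heads
        # One learnable matrix per language; head i sees only its own chunk,
        # so language information is modeled in num_heads separate subspaces.
        self.lang_matrix = nn.Parameter(
            0.02 * torch.randn(num_languages, num_heads, self.head_dim)
        )
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def _split_heads(self, t: torch.Tensor) -> torch.Tensor:
        bsz, seq_len, _ = t.shape
        return t.view(bsz, seq_len, self.num_heads, self.head_dim).transpose(1, 2)

    def forward(self, x: torch.Tensor, lang_id: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); lang_id: (batch,) LongTensor of language ids.
        bsz, seq_len, d_model = x.shape
        q = self._split_heads(self.q_proj(x))  # (batch, heads, seq, head_dim)
        k = self._split_heads(self.k_proj(x))
        v = self._split_heads(self.v_proj(x))
        # Assumed integration: the per-head language vectors are prepended as
        # one extra key/value position, so every token can attend to them.
        lang = self.lang_matrix[lang_id].unsqueeze(2)  # (batch, heads, 1, head_dim)
        k = torch.cat([lang, k], dim=2)
        v = torch.cat([lang, v], dim=2)
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5
        out = torch.softmax(scores, dim=-1) @ v  # (batch, heads, seq, head_dim)
        out = out.transpose(1, 2).reshape(bsz, seq_len, d_model)
        return self.out_proj(out)


# Shape check: two sentences translated into (hypothetical) languages 3 and 7.
attn = LanguageAwareMultiHeadAttention(num_languages=100, d_model=512, num_heads=8)
out = attn(torch.randn(2, 10, 512), torch.tensor([3, 7]))  # -> (2, 10, 512)
```

Chunking the language matrix across heads lets each subspace specialize in a different facet of a language, which is consistent with the abstract's finding that the learned matrix representations capture rich linguistic typology features. The authors' actual implementation is available in the linked repository (cordercorder/nmt-multi).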
Anthology ID:
2022.coling-1.458
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Editors:
Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
5158–5174
URL:
https://aclanthology.org/2022.coling-1.458
Cite (ACL):
Renren Jin and Deyi Xiong. 2022. Informative Language Representation Learning for Massively Multilingual Neural Machine Translation. In Proceedings of the 29th International Conference on Computational Linguistics, pages 5158–5174, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Informative Language Representation Learning for Massively Multilingual Neural Machine Translation (Jin & Xiong, COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.458.pdf
Code
cordercorder/nmt-multi
Data
OPUS-100