Kyoto University MT System Description for IWSLT 2017

Raj Dabre, Fabien Cromieres, Sadao Kurohashi


Abstract
We describe here our Machine Translation (MT) system and the results we obtained for the IWSLT 2017 Multilingual Shared Task. Motivated by Zero Shot NMT [1], we trained a Multilingual Neural Machine Translation model by combining all the training data into one single collection, appending a token to each source sentence to indicate the target language it should be translated into. We observed that even in a low-resource situation we were able to obtain translations whose quality surpasses that of Phrase Based Statistical Machine Translation by several BLEU points. The most surprising result was in the zero-shot setting for Dutch-German and Italian-Romanian, where we observed that despite using no parallel corpora between these language pairs, the NMT model was able to translate between them, and the translations were either as good as or better (in terms of BLEU) than those in the non-zero-resource setting. We also verify that NMT models that use feed-forward layers and self-attention instead of recurrent layers are extremely fast to train, which is useful in an NMT experimental setting.
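The target-language token trick described in the abstract can be made concrete with a short sketch. The snippet below is not the authors' code; it is a minimal illustration, with assumed helper names (tag_source, build_multilingual_corpus) and an assumed token format (<2xx>), of how sentence pairs from all language pairs can be pooled into one training collection with a prepended target-language token.

def tag_source(src_sentence, tgt_lang):
    # Prepend an artificial target-language token (e.g. "<2de>") so the model
    # knows which language to translate into; the exact token format here is
    # an assumption, not necessarily the one used in the paper.
    return "<2{}> {}".format(tgt_lang, src_sentence)

def build_multilingual_corpus(bitexts):
    # bitexts: iterable of (src_lang, tgt_lang, src_sentence, tgt_sentence)
    # tuples pooled from every available language pair; returns one mixed list
    # of (tagged_source, target) training pairs, as described in the abstract.
    pooled = []
    for _src_lang, tgt_lang, src, tgt in bitexts:
        pooled.append((tag_source(src, tgt_lang), tgt))
    return pooled

if __name__ == "__main__":
    data = [
        ("nl", "en", "dit is een test .", "this is a test ."),
        ("it", "ro", "questa e una prova .", "acesta este un test ."),
    ]
    for src, tgt in build_multilingual_corpus(data):
        print(src, "=>", tgt)

Training a single model on such a pooled corpus is what allows zero-shot directions (e.g. Dutch-German) to be requested at test time simply by choosing the appropriate target-language token.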
Anthology ID:
2017.iwslt-1.8
Volume:
Proceedings of the 14th International Conference on Spoken Language Translation
Month:
December 14-15
Year:
2017
Address:
Tokyo, Japan
Editors:
Sakriani Sakti, Masao Utiyama
Venue:
IWSLT
SIG:
SIGSLT
Publisher:
International Workshop on Spoken Language Translation
Pages:
55–59
URL:
https://aclanthology.org/2017.iwslt-1.8
Cite (ACL):
Raj Dabre, Fabien Cromieres, and Sadao Kurohashi. 2017. Kyoto University MT System Description for IWSLT 2017. In Proceedings of the 14th International Conference on Spoken Language Translation, pages 55–59, Tokyo, Japan. International Workshop on Spoken Language Translation.
Cite (Informal):
Kyoto University MT System Description for IWSLT 2017 (Dabre et al., IWSLT 2017)
PDF:
https://aclanthology.org/2017.iwslt-1.8.pdf