TMU Japanese-English Multimodal Machine Translation System for WAT 2020

Hiroto Tamura, Tosho Hirasawa, Masahiro Kaneko, Mamoru Komachi


Abstract
We introduce our TMU system submitted to the Japanese↔English Multimodal Task (constrained) for WAT 2020 (Nakazawa et al., 2020). This task aims to improve translation performance with the help of another modality (images) associated with the input sentences. By its nature, the dataset for a multimodal translation task is low-resource. Our method augments the data by generating noisy translations and by adding noise to the existing training images. We then pretrain a translation model on the augmented noisy data and fine-tune it on the clean data. We also examine probabilistically dropping either the textual or the visual context vector in the decoder, which regularizes the network to make use of both features during training. The experimental results indicate that translation performance can be improved by our textual data augmentation with noising on the target side combined with probabilistic dropping of either context vector.
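The probabilistic dropping described above can be sketched as follows. This is a minimal illustration, not the authors' exact implementation: the function name, the drop probability, and the choice of zeroing one modality uniformly at random are our assumptions for exposition.

```python
import numpy as np

def drop_context(text_ctx, image_ctx, p_drop=0.3, rng=None):
    """With probability p_drop, zero out one of the two context vectors
    (chosen uniformly at random), forcing the decoder to rely on the
    remaining modality for that step. At inference time, set p_drop=0
    so both context vectors pass through unchanged.

    Note: this is an illustrative sketch of the regularization idea;
    the paper's actual decoder-side mechanism may differ in detail.
    """
    rng = rng or np.random.default_rng()
    if rng.random() < p_drop:
        if rng.random() < 0.5:
            # Drop the textual context; keep the visual one.
            return np.zeros_like(text_ctx), image_ctx
        # Drop the visual context; keep the textual one.
        return text_ctx, np.zeros_like(image_ctx)
    return text_ctx, image_ctx
```

Because the decoder cannot count on either modality always being present, it is discouraged from ignoring the (often weaker) visual signal.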
Anthology ID:
2020.wat-1.7
Volume:
Proceedings of the 7th Workshop on Asian Translation
Month:
December
Year:
2020
Address:
Suzhou, China
Editors:
Toshiaki Nakazawa, Hideki Nakayama, Chenchen Ding, Raj Dabre, Anoop Kunchukuttan, Win Pa Pa, Ondřej Bojar, Shantipriya Parida, Isao Goto, Hideya Mino, Hiroshi Manabe, Katsuhito Sudoh, Sadao Kurohashi, Pushpak Bhattacharyya
Venue:
WAT
Publisher:
Association for Computational Linguistics
Pages:
80–91
URL:
https://aclanthology.org/2020.wat-1.7
Cite (ACL):
Hiroto Tamura, Tosho Hirasawa, Masahiro Kaneko, and Mamoru Komachi. 2020. TMU Japanese-English Multimodal Machine Translation System for WAT 2020. In Proceedings of the 7th Workshop on Asian Translation, pages 80–91, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
TMU Japanese-English Multimodal Machine Translation System for WAT 2020 (Tamura et al., WAT 2020)
PDF:
https://aclanthology.org/2020.wat-1.7.pdf