Neural Machine Translation with Phrase-Level Universal Visual Representations

Qingkai Fang, Yang Feng


Abstract
Multimodal machine translation (MMT) aims to improve neural machine translation (NMT) with additional visual information, but most existing MMT methods require paired input of a source sentence and an image, which makes them suffer from a shortage of sentence-image pairs. In this paper, we propose a phrase-level retrieval-based method for MMT that obtains visual information for the source input from existing sentence-image datasets, so that MMT can break the limitation of paired sentence-image input. Our method performs retrieval at the phrase level and hence learns visual information from pairs of source phrases and grounded regions, which mitigates data sparsity. Furthermore, our method employs a conditional variational auto-encoder to learn visual representations, which can filter out redundant visual information and retain only the visual information related to the phrase. Experiments show that the proposed method significantly outperforms strong baselines on multiple MMT datasets, especially when the textual context is limited.
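To make the retrieval idea concrete, here is a minimal sketch of phrase-level visual retrieval. All names (`retrieve_regions`, `jaccard`, the toy datastore) and the token-overlap similarity are illustrative assumptions, not the paper's actual implementation, which grounds phrases to image regions with learned representations:

```python
# Hypothetical sketch: a datastore maps source phrases to grounded-region
# feature vectors; for a new phrase we retrieve the region vectors of the
# most similar stored phrases. Token-overlap similarity stands in for the
# learned phrase-region matching used in the paper.

def jaccard(a, b):
    """Token-overlap (Jaccard) similarity between two phrases."""
    sa, sb = set(a.split()), set(b.split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def retrieve_regions(phrase, datastore, top_k=2):
    """Return the region vectors of the top-k most similar stored phrases."""
    scored = sorted(datastore, key=lambda e: jaccard(phrase, e[0]), reverse=True)
    return [vec for _, vec in scored[:top_k]]

# Toy datastore of (phrase, region-feature-vector) pairs.
datastore = [
    ("a black dog", [0.9, 0.1]),
    ("a brown dog running", [0.8, 0.2]),
    ("a red car", [0.1, 0.9]),
]

regions = retrieve_regions("a small black dog", datastore)
```

The retrieved region vectors would then be fed to the translation model as additional visual context, removing the need for a paired image at inference time.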
Anthology ID:
2022.acl-long.390
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
5687–5698
URL:
https://aclanthology.org/2022.acl-long.390
DOI:
10.18653/v1/2022.acl-long.390
Cite (ACL):
Qingkai Fang and Yang Feng. 2022. Neural Machine Translation with Phrase-Level Universal Visual Representations. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 5687–5698, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Neural Machine Translation with Phrase-Level Universal Visual Representations (Fang & Feng, ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.390.pdf
Code
ictnlp/pluvr