Flow-Adapter Architecture for Unsupervised Machine Translation

Yihong Liu, Haris Jabbar, Hinrich Schuetze


Abstract
In this work, we propose a flow-adapter architecture for unsupervised NMT. It leverages normalizing flows to explicitly model the distributions of sentence-level latent representations, which are subsequently used in conjunction with the attention mechanism for the translation task. The primary novelties of our model are: (a) capturing language-specific sentence representations separately for each language using normalizing flows and (b) using a simple transformation of these latent representations for translating from one language to another. This architecture allows for unsupervised training of each language independently. While there is prior work on latent variables for supervised MT, to the best of our knowledge, this is the first work that uses latent variables and normalizing flows for unsupervised MT. We obtain competitive results on several unsupervised MT benchmarks.
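The abstract's core idea is that an invertible flow maps each language's sentence representation into a shared latent space, so translation reduces to composing one flow's forward pass with another flow's inverse. As a purely illustrative sketch (not the authors' code), the snippet below implements one affine coupling layer, the standard RealNVP-style building block of normalizing flows, and checks its exact invertibility; the toy weight matrices stand in for the small per-language networks a real implementation would train.

```python
import numpy as np

# Hypothetical minimal sketch (NOT the paper's implementation): an affine
# coupling layer, the building block of RealNVP-style normalizing flows.
# It maps a sentence-level representation z to a latent u invertibly, so
# each language's encoder outputs can be normalized into a shared space.

rng = np.random.default_rng(0)
D = 8  # toy dimensionality of the sentence representation

# Toy "scale" and "shift" networks acting on the first half of the vector;
# in practice these would be small learned MLPs, one flow per language.
W_s = rng.normal(scale=0.1, size=(D // 2, D // 2))
W_t = rng.normal(scale=0.1, size=(D // 2, D // 2))

def coupling_forward(z):
    z1, z2 = z[: D // 2], z[D // 2 :]
    s, t = np.tanh(z1 @ W_s), z1 @ W_t
    u2 = z2 * np.exp(s) + t          # affine transform of the second half
    log_det = s.sum()                # log|det Jacobian|, used in flow training
    return np.concatenate([z1, u2]), log_det

def coupling_inverse(u):
    u1, u2 = u[: D // 2], u[D // 2 :]
    s, t = np.tanh(u1 @ W_s), u1 @ W_t
    z2 = (u2 - t) * np.exp(-s)       # exact inverse of the affine step
    return np.concatenate([u1, z2])

z = rng.normal(size=D)               # a toy sentence representation
u, log_det = coupling_forward(z)
z_back = coupling_inverse(u)
assert np.allclose(z, z_back)        # the flow is exactly invertible
```

Under this reading, "a simple transformation of these latent representations" for translation would amount to pushing a source sentence through the source language's flow into the shared latent space and then through the inverse of the target language's flow; the sketch only demonstrates the invertibility that makes such a round trip possible.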
Anthology ID:
2022.acl-long.89
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1253–1266
URL:
https://aclanthology.org/2022.acl-long.89
DOI:
10.18653/v1/2022.acl-long.89
Cite (ACL):
Yihong Liu, Haris Jabbar, and Hinrich Schuetze. 2022. Flow-Adapter Architecture for Unsupervised Machine Translation. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1253–1266, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Flow-Adapter Architecture for Unsupervised Machine Translation (Liu et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.89.pdf