Target Conditioning for One-to-Many Generation

Marie-Anne Lachaux, Armand Joulin, Guillaume Lample


Abstract
Neural Machine Translation (NMT) models often lack diversity in their generated translations, even when paired with a search algorithm such as beam search. A challenge is that the diversity in translations is caused by variability in the target language, and cannot be inferred from the source sentence alone. In this paper, we propose to explicitly model this one-to-many mapping by conditioning the decoder of an NMT model on a latent variable that represents the domain of the target sentences. The domain is a discrete variable generated by a target encoder that is jointly trained with the NMT model. The predicted domain of the target sentence is given as input to the decoder during training. At inference, we can generate diverse translations by decoding with different domains. Unlike our strongest baseline (Shen et al., 2019), our method can scale to any number of domains without affecting the performance or the training time. We assess the quality and diversity of translations generated by our model with several metrics, on three different datasets.
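To make the mechanism concrete, below is a minimal, hypothetical sketch of the target-conditioning idea described in the abstract: a target encoder assigns each reference translation one of K discrete domains, and the decoder is conditioned on an embedding of that domain. All module names, dimensions, and the Gumbel-softmax relaxation used to keep the discrete choice differentiable are illustrative assumptions, not the authors' exact implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TargetDomainEncoder(nn.Module):
    """Maps a reference target sentence to one of K discrete domains."""
    def __init__(self, vocab_size, dim=256, n_domains=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.proj = nn.Linear(dim, n_domains)

    def forward(self, tgt_tokens):
        # Mean-pool token embeddings, then predict domain logits.
        h = self.embed(tgt_tokens).mean(dim=1)           # (batch, dim)
        logits = self.proj(h)                            # (batch, K)
        # Gumbel-softmax (an assumed relaxation): one-hot in the forward
        # pass, differentiable in the backward pass, so the target encoder
        # can be trained jointly with the NMT model.
        return F.gumbel_softmax(logits, tau=1.0, hard=True)

class DomainConditionedInput(nn.Module):
    """Prepends a learned domain embedding to the decoder input sequence."""
    def __init__(self, n_domains=10, dim=256):
        super().__init__()
        self.domain_embed = nn.Linear(n_domains, dim, bias=False)

    def forward(self, domain_onehot, tgt_embeddings):
        d = self.domain_embed(domain_onehot).unsqueeze(1)  # (batch, 1, dim)
        return torch.cat([d, tgt_embeddings], dim=1)

if __name__ == "__main__":
    vocab, dim, K = 1000, 256, 10
    encoder = TargetDomainEncoder(vocab, dim, K)
    condition = DomainConditionedInput(K, dim)
    tgt = torch.randint(0, vocab, (2, 7))    # a batch of reference targets
    tgt_emb = torch.randn(2, 7, dim)         # decoder input embeddings
    # Training: the domain is predicted from the reference translation.
    dec_in = condition(encoder(tgt), tgt_emb)
    print(dec_in.shape)                      # torch.Size([2, 8, 256])
    # Inference: sweep all K one-hot domains to decode K diverse outputs.
    for k in range(K):
        onehot = F.one_hot(torch.tensor([k]), num_classes=K).float()
        # feed condition(onehot, <decoder inputs>) to the NMT decoder here

At inference, sweeping the K one-hot domains yields K candidate translations for a single source sentence, which is how conditioning on the domain produces diverse outputs.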
Anthology ID:
2020.findings-emnlp.256
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2853–2862
URL:
https://aclanthology.org/2020.findings-emnlp.256
DOI:
10.18653/v1/2020.findings-emnlp.256
Cite (ACL):
Marie-Anne Lachaux, Armand Joulin, and Guillaume Lample. 2020. Target Conditioning for One-to-Many Generation. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 2853–2862, Online. Association for Computational Linguistics.
Cite (Informal):
Target Conditioning for One-to-Many Generation (Lachaux et al., Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.256.pdf