Generating Diverse Translation from Model Distribution with Dropout

Xuanfu Wu, Yang Feng, Chenze Shao


Abstract
Despite improvements in translation quality, neural machine translation (NMT) often suffers from a lack of diversity in its generations. In this paper, we propose to generate diverse translations by deriving a large number of possible models with Bayesian modeling and sampling models from them for inference. The possible models are obtained by applying concrete dropout to the NMT model; each model has a specific confidence in its predictions, which corresponds to a posterior model distribution given the training data under the principles of Bayesian modeling. With variational inference, the posterior model distribution can be approximated by a variational distribution, from which the final models for inference are sampled. We conducted experiments on Chinese-English and English-German translation tasks, and the results show that our method achieves a better trade-off between diversity and accuracy.
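The mechanism the abstract describes, sampling a concrete model for each decoding pass by drawing a fresh dropout mask, can be sketched in a few lines. The snippet below is a minimal, hypothetical PyTorch illustration of concrete dropout with a learnable drop probability (following the Concrete, i.e. relaxed-Bernoulli, relaxation of Gal et al.) and of drawing multiple translations by keeping the dropout noise active at inference time. The names `ConcreteDropout`, `sample_diverse_translations`, and `decode_fn` are illustrative assumptions, not the authors' released code, and the KL regularization term that variational inference places on the dropout probabilities during training is omitted for brevity.

```python
import torch
import torch.nn as nn


class ConcreteDropout(nn.Module):
    """Dropout whose drop probability is learned via the Concrete
    (relaxed-Bernoulli) distribution, so each forward pass samples
    one member of the approximate posterior over models."""

    def __init__(self, init_logit: float = -2.0, temperature: float = 0.1):
        super().__init__()
        # Unconstrained parameter; sigmoid maps it to the drop probability p.
        self.p_logit = nn.Parameter(torch.tensor(init_logit))
        self.temperature = temperature

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        p = torch.sigmoid(self.p_logit)
        eps = 1e-7
        u = torch.rand_like(x)  # fresh noise => a fresh sampled "model"
        # Relaxed Bernoulli sample of the per-unit drop probability.
        drop = torch.sigmoid(
            (torch.log(p + eps) - torch.log1p(-p + eps)
             + torch.log(u + eps) - torch.log1p(-u + eps))
            / self.temperature
        )
        return x * (1.0 - drop) / (1.0 - p)  # rescale to preserve the mean


@torch.no_grad()
def sample_diverse_translations(model, src, decode_fn, num_samples: int = 5):
    """Decode `src` several times with dropout left on, so each pass
    uses a different model sampled from the variational distribution."""
    model.train()  # keep dropout stochastic at inference time
    translations = [decode_fn(model, src) for _ in range(num_samples)]
    model.eval()
    return translations
```

Under this reading, each call to `sample_diverse_translations` corresponds to drawing several models from the approximate posterior and decoding once with each, which is what produces the diversity across hypotheses.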
Anthology ID:
2020.emnlp-main.82
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1088–1097
URL:
https://aclanthology.org/2020.emnlp-main.82
DOI:
10.18653/v1/2020.emnlp-main.82
Cite (ACL):
Xuanfu Wu, Yang Feng, and Chenze Shao. 2020. Generating Diverse Translation from Model Distribution with Dropout. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 1088–1097, Online. Association for Computational Linguistics.
Cite (Informal):
Generating Diverse Translation from Model Distribution with Dropout (Wu et al., EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.82.pdf
Video:
https://slideslive.com/38939337