Exploring Model Consensus to Generate Translation Paraphrases

Zhenhao Li, Marina Fomicheva, Lucia Specia


Abstract
This paper describes our submission to the 2020 Duolingo Shared Task on Simultaneous Translation And Paraphrase for Language Education (STAPLE). This task focuses on improving the ability of neural MT systems to generate diverse translations. Our submission explores various methods, including N-best translation, Monte Carlo dropout, Diverse Beam Search, Mixture of Experts, Ensembling, and Lexical Substitution. Our main submission is based on the integration of multiple translations from multiple methods using Consensus Voting. Experiments show that the proposed approach achieves a considerable degree of diversity without introducing noisy translations. Our final submission achieves a 0.5510 weighted F1 score on the blind test set for the English-Portuguese track.
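The consensus voting step described in the abstract can be pictured as a simple candidate-counting scheme over the outputs of the different generation methods. Below is a minimal sketch, assuming each method (N-best translation, Monte Carlo dropout, Diverse Beam Search, etc.) contributes a set of candidate translation strings and a candidate is kept when enough methods propose it; the function name `consensus_vote` and the `min_votes` threshold are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of consensus voting over candidate translations.
# Assumptions (not from the paper): each generation method contributes a set of
# candidate strings, and a candidate is kept if at least `min_votes` methods
# produced it. Names and thresholds here are hypothetical.
from collections import Counter
from typing import Iterable, List, Set


def consensus_vote(candidate_sets: Iterable[Set[str]], min_votes: int = 2) -> List[str]:
    """Keep candidates proposed by at least `min_votes` methods, most-agreed first."""
    votes: Counter = Counter()
    for candidates in candidate_sets:
        votes.update(set(candidates))  # each method votes at most once per candidate
    return [cand for cand, count in votes.most_common() if count >= min_votes]


if __name__ == "__main__":
    # Toy English-Portuguese candidates from three hypothetical methods.
    nbest = {"eu gosto de maçãs", "gosto de maçãs", "eu adoro maçãs"}
    mc_dropout = {"gosto de maçãs", "eu gosto de maçãs"}
    diverse_beam = {"gosto de maçãs", "adoro maçãs"}
    print(consensus_vote([nbest, mc_dropout, diverse_beam], min_votes=2))
    # -> ['gosto de maçãs', 'eu gosto de maçãs']
```

Requiring agreement across methods is what filters out noisy one-off candidates while still admitting the diverse paraphrases that several methods independently generate.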
Anthology ID: 2020.ngt-1.19
Volume: Proceedings of the Fourth Workshop on Neural Generation and Translation
Month: July
Year: 2020
Address: Online
Editors: Alexandra Birch, Andrew Finch, Hiroaki Hayashi, Kenneth Heafield, Marcin Junczys-Dowmunt, Ioannis Konstas, Xian Li, Graham Neubig, Yusuke Oda
Venue: NGT
Publisher: Association for Computational Linguistics
Pages: 161–168
URL: https://aclanthology.org/2020.ngt-1.19
DOI: 10.18653/v1/2020.ngt-1.19
Cite (ACL): Zhenhao Li, Marina Fomicheva, and Lucia Specia. 2020. Exploring Model Consensus to Generate Translation Paraphrases. In Proceedings of the Fourth Workshop on Neural Generation and Translation, pages 161–168, Online. Association for Computational Linguistics.
Cite (Informal): Exploring Model Consensus to Generate Translation Paraphrases (Li et al., NGT 2020)
PDF: https://aclanthology.org/2020.ngt-1.19.pdf
Video: http://slideslive.com/38929833
Code: Nickeilf/STAPLE20
Data: Duolingo STAPLE Shared Task