Bridging the Gap between Training and Inference: Multi-Candidate Optimization for Diverse Neural Machine Translation

Huan Lin, Baosong Yang, Liang Yao, Dayiheng Liu, Haibo Zhang, Jun Xie, Min Zhang, Jinsong Su


Abstract
Diverse NMT aims to generate multiple diverse yet faithful translations for a given source sentence. In this paper, we investigate a common shortcoming of existing diverse NMT studies: the model is usually trained with a single reference, yet is expected to generate multiple candidate translations at inference. This discrepancy between training and inference enlarges the confidence variance and the quality gap among candidate translations, and thus hinders model performance. To address this defect, we propose a multi-candidate optimization framework for diverse NMT. Specifically, we define assessments that score the diversity and the quality of candidate translations during training, and optimize the diverse NMT model with two reinforcement-learning-based strategies, namely hard constrained training and soft constrained training. We conduct experiments on the NIST Chinese-English and WMT14 English-German translation tasks. The results show that our framework is transparent to the underlying diverse NMT model and universally achieves a better trade-off between diversity and quality. Our source code is available at https://github.com/DeepLearnXMU/MultiCanOptim.
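
The abstract specifies the shape of the method (score each candidate for quality and diversity, then optimize with reinforcement learning) but not its exact form. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch, assuming sentence-level BLEU as the quality signal and one minus the average pairwise BLEU among candidates as the diversity signal; the function names, the weighting parameter alpha, and the REINFORCE-with-baseline loss are illustrative assumptions, not the authors' implementation (see the linked repository for that).

```python
# Minimal sketch (NOT the authors' implementation) of multi-candidate
# optimization: sample k candidate translations, score each for quality
# and diversity, and weight its log-likelihood by the combined reward.
import torch
from sacrebleu.metrics import BLEU

bleu = BLEU(effective_order=True)


def quality_reward(candidate: str, reference: str) -> float:
    # Sentence-level BLEU against the single reference (assumed metric).
    return bleu.sentence_score(candidate, [reference]).score / 100.0


def diversity_reward(candidate: str, others: list) -> float:
    # One minus the average pairwise BLEU with the sibling candidates:
    # larger when this candidate differs from the others (assumed metric).
    if not others:
        return 0.0
    overlap = sum(bleu.sentence_score(candidate, [o]).score / 100.0
                  for o in others) / len(others)
    return 1.0 - overlap


def multi_candidate_loss(log_probs: torch.Tensor,
                         candidates: list,
                         reference: str,
                         alpha: float = 0.5) -> torch.Tensor:
    """log_probs: shape (k,), summed token log-probabilities of each
    sampled candidate under the model. Returns a REINFORCE-style loss
    with a mean-reward baseline (a common variance-reduction choice)."""
    rewards = torch.tensor([
        alpha * quality_reward(c, reference)
        + (1.0 - alpha) * diversity_reward(
            c, [o for i, o in enumerate(candidates) if i != j])
        for j, c in enumerate(candidates)
    ])
    advantage = rewards - rewards.mean()   # baseline: mean reward
    return -(advantage * log_probs).mean()
```

Whether the reward acts as a hard filter on candidates or as a soft per-candidate weight, as above, presumably corresponds to the paper's hard versus soft constrained training; that mapping is a guess based on the strategy names alone.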
Anthology ID:
2022.findings-naacl.200
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2622–2632
URL:
https://aclanthology.org/2022.findings-naacl.200
DOI:
10.18653/v1/2022.findings-naacl.200
Cite (ACL):
Huan Lin, Baosong Yang, Liang Yao, Dayiheng Liu, Haibo Zhang, Jun Xie, Min Zhang, and Jinsong Su. 2022. Bridging the Gap between Training and Inference: Multi-Candidate Optimization for Diverse Neural Machine Translation. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 2622–2632, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Bridging the Gap between Training and Inference: Multi-Candidate Optimization for Diverse Neural Machine Translation (Lin et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.200.pdf
Software:
2022.findings-naacl.200.software.zip
Code:
https://github.com/DeepLearnXMU/MultiCanOptim