Learning and Analyzing Generation Order for Undirected Sequence Models

Yichen Jiang, Mohit Bansal


Abstract
Undirected neural sequence models have achieved performance competitive with state-of-the-art directed sequence models, which generate monotonically from left to right, on machine translation tasks. In this work, we train a policy that learns the generation order for a pre-trained, undirected translation model via reinforcement learning. We show that translations decoded in our learned orders achieve higher BLEU scores than translations decoded from left to right or in the learned order of Mansimov et al. (2019) on the WMT’14 German-English translation task. On examples from the De-En and WMT’16 English-Romanian tasks with source and target lengths of at most 30, our learned order outperforms all heuristic generation orders on three out of four language pairs. We then analyze the learned order patterns qualitatively and quantitatively. We show that our policy generally follows an outer-to-inner order: it predicts the left-most and right-most positions first and then moves toward the middle, skipping less important words at the beginning. Furthermore, the policy usually predicts the positions of a single syntactic constituent in consecutive steps. We believe these findings can provide more insight into the mechanism of undirected generation models and encourage further research in this direction.
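As a minimal illustration of the "outer-to-inner" pattern described above (a sketch of ours, not the authors' code; the function name is an assumption), the following Python snippet enumerates target positions alternately from the left and right ends toward the middle:

# Sketch of the outer-to-inner generation order the learned policy
# tends to follow (illustrative only; not the paper's implementation).
def outer_to_inner_order(length):
    """Return target positions in outer-to-inner order.
    For length=6 this yields [0, 5, 1, 4, 2, 3]."""
    order = []
    left, right = 0, length - 1
    while left <= right:
        order.append(left)        # next unfilled position from the left
        if right != left:
            order.append(right)   # next unfilled position from the right
        left += 1
        right -= 1
    return order

print(outer_to_inner_order(6))  # -> [0, 5, 1, 4, 2, 3]

Note that in the paper the order is not fixed by such a heuristic: a learned policy chooses, at each step, which masked position the pre-trained undirected model fills next; this snippet merely reproduces the dominant pattern the authors' analysis reports.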
Anthology ID:
2021.findings-emnlp.298
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
3513–3523
URL:
https://aclanthology.org/2021.findings-emnlp.298
DOI:
10.18653/v1/2021.findings-emnlp.298
Cite (ACL):
Yichen Jiang and Mohit Bansal. 2021. Learning and Analyzing Generation Order for Undirected Sequence Models. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 3513–3523, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Learning and Analyzing Generation Order for Undirected Sequence Models (Jiang & Bansal, Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.298.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.298.mp4
Code:
jiangyctarheel/undirected-generation