Assessing Non-autoregressive Alignment in Neural Machine Translation via Word Reordering

Chun-Hin Tse, Ester Leung, William K. Cheung


Abstract
Recent work on non-autoregressive neural machine translation (NAT) that leverages alignment information to explicitly reduce the modality of the target distribution has reported performance comparable with counterparts that tackle the multi-modality problem by implicitly modeling dependencies. Handling alignment effectively is vital for models that follow this approach, in which a token reordering mechanism typically plays a central role. We review the reordering capability of the respective mechanisms in recent NAT models, and our experimental results show that their performance is sub-optimal. We propose to learn a Transformer-based non-autoregressive language model (NALM) that can be combined with Viterbi decoding to achieve better reordering performance. We evaluate the proposed NALM on the PTB dataset, where sentences with words permuted in different ways are expected to have their ordering recovered. Our empirical results show that the proposed method outperforms state-of-the-art reordering mechanisms under different word permutation settings, with a 2-27 BLEU improvement, suggesting high potential for word alignment in NAT.
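The abstract does not detail the decoding setup. As a rough, hypothetical illustration of the underlying idea of recovering word order by language-model scoring, the Python sketch below reorders a shuffled sentence with a toy bigram model and an exact dynamic program over word subsets (Held-Karp style search). It stands in for, and is not, the paper's Transformer-based NALM with Viterbi decoding; all words, scores, and function names are invented for this example.

# Minimal sketch (not the paper's implementation): recover word order by
# scoring candidate sequences with a toy bigram language model and picking
# the highest-scoring permutation via dynamic programming over word subsets.

# Hypothetical bigram log-probabilities; unseen pairs get a low default score.
BIGRAM_LOGP = {
    ("<s>", "the"): -0.5, ("the", "cat"): -0.7,
    ("cat", "sat"): -0.9, ("sat", "</s>"): -0.6,
}
UNSEEN = -5.0

def bigram(prev, word):
    return BIGRAM_LOGP.get((prev, word), UNSEEN)

def reorder(words):
    """Return the permutation of `words` with the highest bigram LM score."""
    n = len(words)
    # best[(mask, last)] = (score, backpointer) for partial sequences covering
    # the word subset `mask` and ending with word index `last`.
    best = {(1 << i, i): (bigram("<s>", words[i]), None) for i in range(n)}
    for mask in range(1, 1 << n):
        for last in range(n):
            if (mask, last) not in best:
                continue
            score, _ = best[(mask, last)]
            for nxt in range(n):
                if mask & (1 << nxt):
                    continue
                cand = score + bigram(words[last], words[nxt])
                key = (mask | (1 << nxt), nxt)
                if key not in best or cand > best[key][0]:
                    best[key] = (cand, (mask, last))
    full = (1 << n) - 1
    last = max(range(n),
               key=lambda i: best[(full, i)][0] + bigram(words[i], "</s>"))
    # Trace back the best path from the final state.
    order, state = [], (full, last)
    while state is not None:
        order.append(words[state[1]])
        state = best[state][1]
    return list(reversed(order))

print(reorder(["sat", "the", "cat"]))  # -> ['the', 'cat', 'sat']

The exact subset search is exponential in sentence length and is used here only to keep the toy example self-contained; the paper instead relies on Viterbi decoding with a learned NALM.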
Anthology ID: 2022.findings-emnlp.172
Volume: Findings of the Association for Computational Linguistics: EMNLP 2022
Month: December
Year: 2022
Address: Abu Dhabi, United Arab Emirates
Editors: Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 2327–2333
URL: https://aclanthology.org/2022.findings-emnlp.172
DOI: 10.18653/v1/2022.findings-emnlp.172
Cite (ACL): Chun-Hin Tse, Ester Leung, and William K. Cheung. 2022. Assessing Non-autoregressive Alignment in Neural Machine Translation via Word Reordering. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 2327–2333, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal): Assessing Non-autoregressive Alignment in Neural Machine Translation via Word Reordering (Tse et al., Findings 2022)
PDF: https://aclanthology.org/2022.findings-emnlp.172.pdf