Non-Autoregressive Neural Machine Translation: A Call for Clarity

Robin Schmidt, Telmo Pires, Stephan Peitz, Jonas Lööf


Abstract
Non-autoregressive approaches aim to improve the inference speed of translation models by only requiring a single forward pass to generate the output sequence instead of iteratively producing each predicted token. Consequently, their translation quality still tends to be inferior to their autoregressive counterparts due to several issues involving output token interdependence. In this work, we take a step back and revisit several techniques that have been proposed for improving non-autoregressive translation models and compare their combined translation quality and speed implications under third-party testing environments. We provide novel insights for establishing strong baselines using length prediction or CTC-based architecture variants and contribute standardized BLEU, chrF++, and TER scores using sacreBLEU on four translation tasks, which crucially have been missing as inconsistencies in the use of tokenized BLEU lead to deviations of up to 1.7 BLEU points. Our open-sourced code is integrated into fairseq for reproducibility.
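As a concrete illustration of the standardized evaluation advocated in the abstract, the sketch below shows how BLEU, chrF++, and TER can be computed on detokenized output with sacreBLEU's Python API (assuming sacreBLEU 2.x); the file names hyp.detok.txt and ref.detok.txt are placeholders for illustration, not paths from the paper's released code.

```python
# Minimal sketch (assumes sacreBLEU 2.x; file names are placeholders):
# compute standardized BLEU, chrF++, and TER on detokenized hypotheses and
# references, letting sacreBLEU apply its own tokenization consistently.
from sacrebleu.metrics import BLEU, CHRF, TER

with open("hyp.detok.txt", encoding="utf-8") as f:
    hypotheses = [line.rstrip("\n") for line in f]
with open("ref.detok.txt", encoding="utf-8") as f:
    references = [line.rstrip("\n") for line in f]

# sacreBLEU expects a list of reference streams (here: one reference per segment).
refs = [references]

bleu = BLEU()
chrf = CHRF(word_order=2)  # chrF with word n-grams up to order 2, i.e. chrF++
ter = TER()

print(bleu.corpus_score(hypotheses, refs), bleu.get_signature())
print(chrf.corpus_score(hypotheses, refs), chrf.get_signature())
print(ter.corpus_score(hypotheses, refs), ter.get_signature())
```

On the command line, the equivalent is roughly `sacrebleu ref.detok.txt -i hyp.detok.txt -m bleu chrf ter --chrf-word-order 2`; reporting the emitted metric signatures alongside the scores is what makes results comparable across papers.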
Anthology ID: 2022.emnlp-main.179
Volume: Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2022
Address: Abu Dhabi, United Arab Emirates
Editors: Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 2785–2799
URL: https://aclanthology.org/2022.emnlp-main.179
DOI: 10.18653/v1/2022.emnlp-main.179
Cite (ACL): Robin Schmidt, Telmo Pires, Stephan Peitz, and Jonas Lööf. 2022. Non-Autoregressive Neural Machine Translation: A Call for Clarity. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 2785–2799, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal): Non-Autoregressive Neural Machine Translation: A Call for Clarity (Schmidt et al., EMNLP 2022)
PDF: https://aclanthology.org/2022.emnlp-main.179.pdf