Hybrid-Regressive Paradigm for Accurate and Speed-Robust Neural Machine Translation

Qiang Wang, Xinhui Hu, Ming Chen


Abstract
This work empirically confirms that non-autoregressive translation (NAT) is less robust than autoregressive translation (AT) to variations in decoding batch size and hardware settings. To address this issue, we demonstrate through synthetic experiments that providing a small number of AT predictions as prompts can significantly reduce the performance gap between AT and NAT. Following this line, we propose hybrid-regressive translation (HRT), a two-stage translation prototype that combines the strengths of AT and NAT. Specifically, HRT first generates a discontinuous sequence via autoregression (e.g., predicting every k-th token, k>1) and then fills in all previously skipped tokens at once in a non-autoregressive manner. Experiments on five translation tasks show that HRT achieves translation quality comparable to AT while offering at least 1.5x faster inference regardless of batch size and device. Additionally, HRT successfully inherits the sound characteristics of AT in the deep-encoder-shallow-decoder architecture, allowing for further speedup without BLEU loss.
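The two-stage decoding described in the abstract can be sketched in a few lines. The following is a minimal toy illustration, not the authors' released implementation: `at_step` and `nat_fill` are hypothetical stand-in models, assumed to play the roles of the autoregressive skeleton predictor and the one-shot non-autoregressive filler.

```python
def hrt_decode(at_step, nat_fill, length, k=2):
    """Toy sketch of hybrid-regressive translation (HRT) decoding.

    Stage 1: autoregressively predict every k-th token (a sparse skeleton).
    Stage 2: fill all remaining positions in one non-autoregressive pass.
    """
    MASK = None
    output = [MASK] * length

    # Stage 1: autoregressive pass over positions 0, k, 2k, ...
    prefix = []
    for pos in range(0, length, k):
        tok = at_step(prefix)  # predict the next skeleton token
        output[pos] = tok
        prefix.append(tok)

    # Stage 2: a single parallel pass proposes tokens for every position;
    # only the still-masked positions take the NAT predictions.
    filled = nat_fill(output)
    return [o if o is not None else f for o, f in zip(output, filled)]


# Dummy models for illustration only: the "AT" model emits increasing ids,
# and the "NAT" model copies the nearest known token to the left.
def at_step(prefix):
    return len(prefix) * 10

def nat_fill(partial):
    out, last = [], 0
    for t in partial:
        last = t if t is not None else last
        out.append(last)
    return out

print(hrt_decode(at_step, nat_fill, length=6, k=2))  # → [0, 0, 10, 10, 20, 20]
```

With k=2, only half of the positions require a sequential decoding step, which is the source of the claimed speedup; the remaining positions are completed in one parallel pass, so latency grows with length/k rather than with length.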
Anthology ID:
2023.findings-acl.367
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5931–5945
URL:
https://aclanthology.org/2023.findings-acl.367
DOI:
10.18653/v1/2023.findings-acl.367
Cite (ACL):
Qiang Wang, Xinhui Hu, and Ming Chen. 2023. Hybrid-Regressive Paradigm for Accurate and Speed-Robust Neural Machine Translation. In Findings of the Association for Computational Linguistics: ACL 2023, pages 5931–5945, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Hybrid-Regressive Paradigm for Accurate and Speed-Robust Neural Machine Translation (Wang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.367.pdf