Non-Autoregressive Models for Fast Sequence Generation

Yang Feng, Chenze Shao


Abstract
Autoregressive (AR) models have achieved great success in various sequence generation tasks. However, AR models can only generate the target sequence word by word due to the autoregressive mechanism and hence suffer from slow inference. Recently, non-autoregressive (NAR) models, which generate all the tokens in parallel by removing the sequential dependencies within the target sequence, have received increasing attention in sequence generation tasks such as neural machine translation (NMT), automatic speech recognition (ASR), and text-to-speech (TTS). In this tutorial, we will provide a comprehensive introduction to non-autoregressive sequence generation.
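To make the contrast concrete, here is a minimal, hedged sketch (not code from the tutorial) of the two decoding regimes: an AR decoder needs one sequential step per target token, while a NAR decoder scores every position in a single parallel step. The toy function and all names (toy_scores, VOCAB, TGT_LEN) are illustrative assumptions, not an actual model.

```python
# Toy illustration of AR vs. NAR decoding; assumes PyTorch is installed.
import torch

VOCAB, TGT_LEN = 8, 5
torch.manual_seed(0)

def toy_scores(prefix: list) -> torch.Tensor:
    # Stand-in for one decoder step: logits over the vocabulary,
    # nominally conditioned on the tokens generated so far.
    return torch.randn(VOCAB)

# Autoregressive: TGT_LEN sequential steps, each waiting on the previous one.
ar_out = []
for _ in range(TGT_LEN):
    logits = toy_scores(ar_out)
    ar_out.append(int(logits.argmax()))

# Non-autoregressive: all positions scored at once, no dependency on a prefix.
nar_logits = torch.randn(TGT_LEN, VOCAB)
nar_out = nar_logits.argmax(dim=-1).tolist()

print("AR :", ar_out)   # produced in TGT_LEN sequential steps
print("NAR:", nar_out)  # produced in one parallel step
```

The speedup of NAR inference comes from collapsing the sequential loop into a single parallel pass; the tutorial's focus is on the modeling techniques needed to keep output quality high once those sequential dependencies are removed.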
Anthology ID:
2022.emnlp-tutorials.6
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: Tutorial Abstracts
Month:
December
Year:
2022
Address:
Abu Dhabi, UAE
Editors:
Samhaa R. El-Beltagy, Xipeng Qiu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
30–35
URL:
https://aclanthology.org/2022.emnlp-tutorials.6
DOI:
10.18653/v1/2022.emnlp-tutorials.6
Cite (ACL):
Yang Feng and Chenze Shao. 2022. Non-Autoregressive Models for Fast Sequence Generation. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: Tutorial Abstracts, pages 30–35, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Non-Autoregressive Models for Fast Sequence Generation (Feng & Shao, EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-tutorials.6.pdf