Self-Improvement of Non-autoregressive Model via Sequence-Level Distillation

Yusheng Liao, Shuyang Jiang, Yiqi Li, Yu Wang, Yanfeng Wang


Abstract
Although Non-autoregressive Transformer (NAT) models have achieved great success in terms of fast inference speed, this speedup comes with a performance drop due to the inherent multi-modality problem of the NAT model. Previous works commonly alleviate this problem by replacing the target side of the raw data with distilled data generated by Autoregressive Transformer (AT) models. However, the multi-modality problem in the distilled data is still significant and thus limits further improvement of the NAT models. In this paper, we propose a method called Sequence-Level Self-Distillation (SLSD), which aims to generate distilled data by the NAT model itself, eliminating the need for additional teacher networks. Furthermore, SLSD can adapt to different NAT models without precise adjustments since the self-distilled data is generated from the same type of NAT model. We conduct extensive experiments on WMT14 EN-DE and WMT16 EN-RO and choose four classic NAT models as the backbones to validate the generality and effectiveness of SLSD. The results show that our approach can consistently improve all models on both raw data and distilled data without sacrificing the inference speed.
Anthology ID:
2023.emnlp-main.878
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
14202–14212
URL:
https://aclanthology.org/2023.emnlp-main.878
DOI:
10.18653/v1/2023.emnlp-main.878
Cite (ACL):
Yusheng Liao, Shuyang Jiang, Yiqi Li, Yu Wang, and Yanfeng Wang. 2023. Self-Improvement of Non-autoregressive Model via Sequence-Level Distillation. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 14202–14212, Singapore. Association for Computational Linguistics.
Cite (Informal):
Self-Improvement of Non-autoregressive Model via Sequence-Level Distillation (Liao et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.878.pdf
Video:
https://aclanthology.org/2023.emnlp-main.878.mp4