Isotropy-Enhanced Conditional Masked Language Models

Pei Guo, Yisheng Xiao, Juntao Li, Yixin Ji, Min Zhang


Abstract
Non-autoregressive (NAR) models have been widely used for various text generation tasks to accelerate the inference process, but at some cost to generation quality. To achieve a good balance between inference speedup and generation quality, iterative NAR models such as CMLM and DisCo have been proposed. Researchers have made considerable follow-up progress based on them, and some recent iterative models achieve very promising performance while maintaining a significant speedup. In this paper, we provide further insight into iterative NAR models by exploring the anisotropy problem, i.e., the representations of distinct predicted target tokens are similar and indiscriminative. After confirming the anisotropy problem in iterative NAR models, we first analyze the effectiveness of the contrastive learning method and further propose the Look Neighbors strategy to enhance the learning of token representations during training. Experiments on 4 WMT datasets show that our methods consistently improve performance and alleviate the anisotropy problem of the conditional masked language model, even outperforming the current SoTA result on WMT14 EN→DE.
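The anisotropy problem described in the abstract is commonly quantified as the average pairwise cosine similarity among predicted token representations: the closer this value is to 1, the less discriminative the representation space. The following is a minimal sketch of such a measurement, not the paper's implementation; the function name and the assumption that decoder hidden states are available as a single tensor are illustrative only.

```python
# Hypothetical sketch (not from the paper): estimate anisotropy as the average
# pairwise cosine similarity of decoder hidden states for one target sentence.
# Higher values indicate a more anisotropic (less discriminative) space.
import torch
import torch.nn.functional as F

def average_pairwise_cosine(hidden_states: torch.Tensor) -> float:
    """hidden_states: (num_tokens, hidden_dim) decoder outputs for one sentence."""
    normed = F.normalize(hidden_states, dim=-1)    # unit-normalize each token vector
    sims = normed @ normed.T                       # (num_tokens, num_tokens) cosine matrix
    n = sims.size(0)
    off_diag = sims.sum() - sims.diagonal().sum()  # exclude self-similarity on the diagonal
    return (off_diag / (n * (n - 1))).item()

# Example: random Gaussian vectors are roughly isotropic, so the score is near 0.
print(average_pairwise_cosine(torch.randn(16, 512)))
```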
Anthology ID:
2023.findings-emnlp.555
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
8278–8289
URL:
https://aclanthology.org/2023.findings-emnlp.555
DOI:
10.18653/v1/2023.findings-emnlp.555
Cite (ACL):
Pei Guo, Yisheng Xiao, Juntao Li, Yixin Ji, and Min Zhang. 2023. Isotropy-Enhanced Conditional Masked Language Models. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 8278–8289, Singapore. Association for Computational Linguistics.
Cite (Informal):
Isotropy-Enhanced Conditional Masked Language Models (Guo et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.555.pdf