PRINCE: Prefix-Masked Decoding for Knowledge Enhanced Sequence-to-Sequence Pre-Training

Song Xu, Haoran Li, Peng Yuan, Youzheng Wu, Xiaodong He


Abstract
Pre-trained Language Models (PLMs) have shown effectiveness in various Natural Language Processing (NLP) tasks. The denoising autoencoder is one of the most successful pre-training frameworks, learning to reconstruct the original text from a noise-corrupted version. Existing studies mainly focus on injecting noise into the input. This paper introduces a simple yet effective pre-training paradigm equipped with a knowledge-enhanced decoder that predicts the next entity token with noise in the prefix, explicitly strengthening the representation learning of entities that span multiple input tokens. Specifically, when predicting the next token within an entity, we feed masks into the prefix in place of some of the previous ground-truth tokens that constitute the entity. Our model achieves new state-of-the-art results on two knowledge-driven data-to-text generation tasks, with BLEU gains of up to 2%.
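As a rough illustration of the prefix-masking step described in the abstract, the sketch below corrupts the teacher-forcing prefix by replacing some of the previous ground-truth tokens inside an entity span with a mask token. This is a minimal sketch, not the authors' released implementation: the mask id, masking probability, helper name, and the [start, end) span format are assumptions made for the example.

```python
import random

# Minimal sketch of prefix-masked decoding; not the authors' code.
MASK_ID = 103            # assumed mask token id; depends on the tokenizer/vocabulary
PREFIX_MASK_PROB = 0.5   # assumed probability of masking a previous in-entity token


def build_prefix_masked_decoder_input(target_ids, entity_spans, mask_prob=PREFIX_MASK_PROB):
    """Return a teacher-forcing prefix in which some ground-truth tokens inside an
    entity span are replaced by MASK_ID, so the decoder must predict the next
    entity token without relying on the full uncorrupted prefix.
    (BOS/right-shift handling is omitted for brevity.)"""
    decoder_input = list(target_ids)
    for start, end in entity_spans:
        # Keep the first token of each entity so the prefix still marks where the
        # entity begins; later entity tokens are masked with probability mask_prob.
        for i in range(start + 1, end):
            if random.random() < mask_prob:
                decoder_input[i] = MASK_ID
    return decoder_input


# Toy usage: ids 501-502 and 601-603 stand for two multi-token entity mentions.
target = [11, 12, 13, 501, 502, 14, 601, 602, 603]
entity_spans = [(3, 5), (6, 9)]
print(build_prefix_masked_decoder_input(target, entity_spans))
```

In the denoising setup the abstract describes, the training loss would presumably still be computed against the original (unmasked) target tokens, so masking the prefix pushes the decoder to recover entity tokens from the encoded input knowledge rather than copying them from the previously generated prefix.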
Anthology ID: 2022.emnlp-main.171
Volume: Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2022
Address: Abu Dhabi, United Arab Emirates
Editors: Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 2675–2681
URL: https://aclanthology.org/2022.emnlp-main.171
DOI: 10.18653/v1/2022.emnlp-main.171
Cite (ACL): Song Xu, Haoran Li, Peng Yuan, Youzheng Wu, and Xiaodong He. 2022. PRINCE: Prefix-Masked Decoding for Knowledge Enhanced Sequence-to-Sequence Pre-Training. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 2675–2681, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal): PRINCE: Prefix-Masked Decoding for Knowledge Enhanced Sequence-to-Sequence Pre-Training (Xu et al., EMNLP 2022)
PDF: https://aclanthology.org/2022.emnlp-main.171.pdf