Parallel Refinements for Lexically Constrained Text Generation with BART

Xingwei He


Abstract
Lexically constrained text generation aims to control the generated text by incorporating certain pre-specified keywords into the output. Previous work injects lexical constraints into the output by controlling the decoding process or refining the candidate output iteratively, which tends to generate generic or ungrammatical sentences and has high computational complexity. To address these challenges, we propose Constrained BART (CBART) for lexically constrained text generation. CBART leverages the pre-trained model BART, and transfers part of the generation burden from the decoder to the encoder by decomposing this task into two sub-tasks, thereby improving sentence quality. Concretely, we extend BART by adding a token-level classifier over the encoder to instruct the decoder where to replace and insert. Guided by the encoder, the decoder refines multiple tokens of the input in one step by inserting tokens before specific positions and re-predicting tokens with low confidence. To further reduce the inference latency, the decoder predicts all tokens in parallel. Experimental results on One-Billion-Word and Yelp show that CBART can generate plausible text with high quality and diversity while substantially accelerating inference.
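The refinement loop the abstract describes — an encoder-side classifier labeling each token as keep, replace, or insert, and a decoder filling all masked positions in parallel — can be sketched in simplified form. This is a hypothetical illustration of the control flow only: the real CBART uses BART's encoder and decoder, whereas `classify` and `predict` here are stand-in functions supplied by the caller.

```python
# Toy sketch of CBART-style parallel refinement (hypothetical, simplified).
# The actual paper builds on BART; here the encoder "classifier" and the
# decoder "predictor" are abstract callables to show the iteration structure.

COPY, REPLACE, INSERT = 0, 1, 2

def refine(tokens, classify, predict, max_steps=10):
    """Iteratively edit `tokens` until the classifier keeps every token.

    classify(tokens) -> one label per token (COPY / REPLACE / INSERT);
                        INSERT means "insert a new token before this position".
    predict(draft, mask_positions) -> draft with all masks filled in parallel.
    """
    for _ in range(max_steps):
        labels = classify(tokens)
        if all(label == COPY for label in labels):
            break                            # classifier accepts the sentence
        draft, masks = [], []
        for tok, label in zip(tokens, labels):
            if label == INSERT:              # insert a mask before this token
                masks.append(len(draft))
                draft.append("<mask>")
                draft.append(tok)
            elif label == REPLACE:           # re-predict a low-confidence token
                masks.append(len(draft))
                draft.append("<mask>")
            else:                            # keep the token as-is
                draft.append(tok)
        tokens = predict(draft, masks)       # all masks filled in one pass
    return tokens
```

With deterministic toy components — a classifier that asks for an insertion before "mat" until "on" appears, and a predictor that fills every mask with "on" — `refine(["cat", "mat"], ...)` converges to `["cat", "on", "mat"]` in one refinement step, keeping the keyword constraints intact.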
Anthology ID:
2021.emnlp-main.681
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
8653–8666
URL:
https://aclanthology.org/2021.emnlp-main.681
DOI:
10.18653/v1/2021.emnlp-main.681
Cite (ACL):
Xingwei He. 2021. Parallel Refinements for Lexically Constrained Text Generation with BART. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 8653–8666, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Parallel Refinements for Lexically Constrained Text Generation with BART (He, EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.681.pdf
Video:
https://aclanthology.org/2021.emnlp-main.681.mp4
Code:
nlpcode/cbart