Look-back Decoding for Open-Ended Text Generation

Nan Xu, Chunting Zhou, Asli Celikyilmaz, Xuezhe Ma


Abstract
Given a prefix (context), open-ended generation aims to decode text that is coherent (does not abruptly drift from previous topics) and informative (does not suffer from undesired repetitions). In this paper, we propose Look-back, an improved decoding algorithm that leverages the Kullback–Leibler divergence to track the distance between the probability distributions of the current and historical decoding steps. Look-back can thus automatically predict potential repetitive phrases and topic drift, and remove tokens that may cause these failure modes, restricting the next-token probability distribution to within a plausible distance of the history. We perform decoding experiments on document continuation and story generation, and demonstrate that Look-back generates more fluent and coherent text, significantly outperforming other strong decoding methods in both automatic and human evaluations.
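The abstract describes the mechanism only at a high level. As a rough illustration, here is a minimal Python sketch of KL-divergence-based look-back decoding with a Hugging Face causal language model. This is not the authors' released implementation: the threshold `alpha`, the top-k candidate set, and the greedy fallback are illustrative assumptions, not the paper's exact procedure or hyperparameters.

```python
import torch
import torch.nn.functional as F
from transformers import AutoModelForCausalLM, AutoTokenizer


def kl_divergence(p, q, eps=1e-10):
    """KL(p || q) between two next-token distributions over the vocabulary."""
    return torch.sum(p * (torch.log(p + eps) - torch.log(q + eps)))


@torch.no_grad()
def lookback_generate(model, tokenizer, prompt, max_new_tokens=64,
                      alpha=0.5, top_k=8):
    """Greedy decoding with a KL-based repetition alarm (illustrative sketch)."""
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    history = []  # next-token distributions from earlier decoding steps
    for _ in range(max_new_tokens):
        logits = model(input_ids).logits[0, -1]
        probs = F.softmax(logits, dim=-1)
        # Distance from the current step to the closest historical step;
        # a small minimum KL signals the model is about to repeat itself.
        min_kl = min((kl_divergence(probs, h) for h in history),
                     default=float("inf"))
        if min_kl < alpha:
            # Alarm triggered: among the top-k candidates, keep the token
            # whose continuation stays farthest (in KL) from the history.
            # This is a simplification of the paper's constraint that the
            # next distribution stay a plausible distance from the history.
            cand = torch.topk(probs, top_k).indices
            best, best_dist = cand[0], -1.0
            for tok in cand:
                ext = torch.cat([input_ids, tok.view(1, 1)], dim=-1)
                q = F.softmax(model(ext).logits[0, -1], dim=-1)
                dist = min(kl_divergence(q, h) for h in history)
                if dist > best_dist:
                    best, best_dist = tok, dist
            next_token = best.view(1, 1)
        else:
            next_token = probs.argmax().view(1, 1)  # default: greedy step
        history.append(probs)
        input_ids = torch.cat([input_ids, next_token], dim=-1)
    return tokenizer.decode(input_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Example usage with a small checkpoint; any causal LM would do.
    tok = AutoTokenizer.from_pretrained("gpt2")
    lm = AutoModelForCausalLM.from_pretrained("gpt2")
    lm.eval()
    print(lookback_generate(lm, tok, "The city council met on Tuesday to"))
```

The candidate re-scoring loop runs one forward pass per top-k token, which is slow but keeps the sketch readable; a practical implementation would batch those passes and cache key/value states.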
Anthology ID:
2023.emnlp-main.66
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1039–1050
URL:
https://aclanthology.org/2023.emnlp-main.66
DOI:
10.18653/v1/2023.emnlp-main.66
Cite (ACL):
Nan Xu, Chunting Zhou, Asli Celikyilmaz, and Xuezhe Ma. 2023. Look-back Decoding for Open-Ended Text Generation. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 1039–1050, Singapore. Association for Computational Linguistics.
Cite (Informal):
Look-back Decoding for Open-Ended Text Generation (Xu et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.66.pdf