Prompt-Based Length Controlled Generation with Multiple Control Types

Renlong Jie, Xiaojun Meng, Lifeng Shang, Xin Jiang, Qun Liu


Abstract
Large language models (LLMs) have attracted great attention given their strong performance on a wide range of NLP tasks. In practice, users often expect generated texts to fall within a specific length range, making length controlled generation an important topic, especially for GPT-style models. Existing length control methods mostly focus on a simple control type of “equal to” a target length. In contrast, we propose a prompt-based method to achieve length controlled generation under different control types with high accuracy. In particular, we adopt reinforcement learning (RL) and sample filtering with the reward signal given by rule-based reward models, which enhances the length control ability of models by rewarding outputs that follow certain control instructions. In addition, we introduce a standard prompt extractor to parse arbitrary users’ input into standard control instructions. Experiments show that our method significantly improves the accuracy of prompt-based length control on popular summarization datasets such as CNNDM and NYT under multiple control types. Moreover, both the standard prompt extractor and the RL-tuned model show strong generalization to unseen control prompt templates.
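
To illustrate the kind of rule-based reward the abstract refers to, the sketch below scores an output length against a control instruction for several control types (equal to, at most, at least, between). This is a minimal, hypothetical example: the function name, control-type labels, tolerance, and penalty scaling are assumptions for illustration and are not the paper's actual reward design.

def length_reward(output_length, control_type, target, target_high=None, tolerance=5):
    # Hypothetical rule-based reward: +1 if the output length satisfies the
    # control instruction, otherwise a penalty proportional to the violation.
    # `control_type` is one of "equal", "at_most", "at_least", "between".
    if control_type == "equal":
        violation = max(0, abs(output_length - target) - tolerance)
    elif control_type == "at_most":
        violation = max(0, output_length - target)
    elif control_type == "at_least":
        violation = max(0, target - output_length)
    elif control_type == "between":
        violation = max(0, target - output_length, output_length - target_high)
    else:
        raise ValueError(f"unknown control type: {control_type}")
    return 1.0 if violation == 0 else -violation / target

# Example usage: 120 words under "at most 100 words" is penalized,
# while 90 words satisfies the instruction.
print(length_reward(120, "at_most", 100))  # -0.2
print(length_reward(90, "at_most", 100))   # 1.0

In an RL or sample-filtering setup, such a reward could be computed per generated sample and used either to update the policy or to keep only samples whose reward equals 1.0; again, this describes a plausible setup rather than the authors' exact procedure.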
Anthology ID:
2024.findings-acl.63
Volume:
Findings of the Association for Computational Linguistics ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand and virtual meeting
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1067–1085
URL:
https://aclanthology.org/2024.findings-acl.63
Cite (ACL):
Renlong Jie, Xiaojun Meng, Lifeng Shang, Xin Jiang, and Qun Liu. 2024. Prompt-Based Length Controlled Generation with Multiple Control Types. In Findings of the Association for Computational Linguistics ACL 2024, pages 1067–1085, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
Cite (Informal):
Prompt-Based Length Controlled Generation with Multiple Control Types (Jie et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.63.pdf