Bounding the Capabilities of Large Language Models in Open Text Generation with Prompt Constraints

Albert Lu, Hongxin Zhang, Yanzhe Zhang, Xuezhi Wang, Diyi Yang


Abstract
The limits of open-ended generative models are unclear, yet increasingly important. What causes them to succeed and what causes them to fail? In this paper, we take a prompt-centric approach to analyzing and bounding the abilities of open-ended generative models. We present a generic methodology of analysis with two challenging prompt constraint types: structural and stylistic. Each constraint type is broken down into a set of well-defined constraints, each analyzable through a single prompt. We then systematically create a diverse set of simple, natural, and useful prompts to robustly analyze each individual constraint. Using the GPT-3 text-davinci-002 model as a case study, we generate outputs from our collection of prompts and analyze the model’s generative failures. We also show that our proposed method generalizes to other large models, such as BLOOM and OPT. Our results and our in-context mitigation strategies reveal open challenges for future research.
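To make the probing setup concrete, the following is a minimal sketch, not the paper's actual prompts or evaluation code: it sends a hypothetical structural-constraint prompt (a fixed sentence count) to text-davinci-002 and crudely checks whether the constraint was met. The prompt wording, the sentence check, and the use of the legacy openai Python package (pre-1.0) are all assumptions for illustration.

# Minimal sketch (not from the paper): probe one structural prompt
# constraint and check whether the generated output satisfies it.
# Assumes the legacy openai Python package (pre-1.0) with an API key
# configured in the environment; the prompt and the sentence check are
# hypothetical stand-ins for the paper's own prompt set and analysis.
import re

import openai

def count_sentences(text: str) -> int:
    # Crude split on sentence-final punctuation; evaluating structural
    # constraints rigorously would need a sturdier segmenter.
    return len([s for s in re.split(r"[.!?]+", text) if s.strip()])

def satisfies_sentence_constraint(n: int = 5) -> bool:
    prompt = f"Write a short story about a lighthouse in exactly {n} sentences."
    response = openai.Completion.create(
        model="text-davinci-002",
        prompt=prompt,
        max_tokens=256,
        temperature=0.7,
    )
    output = response["choices"][0]["text"]
    return count_sentences(output) == n

# Repeating such probes over a diverse collection of prompts yields a
# per-constraint failure rate, the kind of bound the abstract describes.
print("constraint satisfied:", satisfies_sentence_constraint())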
Anthology ID: 2023.findings-eacl.148
Volume: Findings of the Association for Computational Linguistics: EACL 2023
Month: May
Year: 2023
Address: Dubrovnik, Croatia
Editors: Andreas Vlachos, Isabelle Augenstein
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 1982–2008
URL: https://aclanthology.org/2023.findings-eacl.148
DOI: 10.18653/v1/2023.findings-eacl.148
Cite (ACL): Albert Lu, Hongxin Zhang, Yanzhe Zhang, Xuezhi Wang, and Diyi Yang. 2023. Bounding the Capabilities of Large Language Models in Open Text Generation with Prompt Constraints. In Findings of the Association for Computational Linguistics: EACL 2023, pages 1982–2008, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal): Bounding the Capabilities of Large Language Models in Open Text Generation with Prompt Constraints (Lu et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-eacl.148.pdf
Video: https://aclanthology.org/2023.findings-eacl.148.mp4