PromptGen: Automatically Generate Prompts using Generative Models

Yue Zhang, Hongliang Fei, Dingcheng Li, Ping Li


Abstract
Recently, prompt learning has received significant attention, where downstream tasks are reformulated as mask-filling tasks with the help of a textual prompt. The key challenge in prompt learning is finding the most appropriate prompt. This paper proposes a novel model, PromptGen, which can automatically generate prompts conditioned on the input sentence. PromptGen is the first work to consider dynamic prompt generation for knowledge probing based on a pre-trained generative model. To mitigate label information leaking from the pre-trained generative model, given a generated prompt, we replace the query input with “None”. We require that this perturbed, context-free prompt cannot trigger the correct label. We evaluate our model on the LAMA knowledge-probing benchmark and show that PromptGen significantly outperforms other baselines.
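The abstract's label-leak check can be illustrated with a minimal sketch. Here `generate_prompt` and `fill_mask` are hypothetical stand-ins (a toy lookup table replaces the real generative model and masked LM; this is not the authors' implementation) showing the idea of replacing the query input with “None” and rejecting prompts whose context-free version still triggers the gold label:

```python
def fill_mask(prompt: str) -> str:
    # Stand-in for a masked LM: returns the top token predicted for [MASK].
    # Faked with a lookup table so the sketch is self-contained.
    known = {"Barack Obama was born in [MASK].": "Hawaii"}
    return known.get(prompt, "<unk>")

def generate_prompt(subject: str) -> str:
    # Stand-in for the generative model that emits a prompt
    # conditioned on the input entity/sentence.
    return f"{subject} was born in [MASK]."

def leaks_label(gold: str) -> bool:
    # Replace the query input with "None": if the perturbed,
    # context-free prompt still yields the gold label, the prompt
    # leaks label information from the generator rather than
    # probing knowledge about the subject.
    perturbed = generate_prompt("None")
    return fill_mask(perturbed) == gold

# A generated prompt is acceptable only if its context-free
# version does NOT trigger the correct answer.
prompt = generate_prompt("Barack Obama")
acceptable = not leaks_label("Hawaii")
```

In the real model, `fill_mask` would score the gold label's probability under a pre-trained masked LM rather than compare a single top token.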
Anthology ID:
2022.findings-naacl.3
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
30–37
URL:
https://aclanthology.org/2022.findings-naacl.3
DOI:
10.18653/v1/2022.findings-naacl.3
Cite (ACL):
Yue Zhang, Hongliang Fei, Dingcheng Li, and Ping Li. 2022. PromptGen: Automatically Generate Prompts using Generative Models. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 30–37, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
PromptGen: Automatically Generate Prompts using Generative Models (Zhang et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.3.pdf
Software:
2022.findings-naacl.3.software.zip
Data:
LAMA