Scalable Prompt Generation for Semi-supervised Learning with Language Models

Yuhang Zhou, Suraj Maharjan, Beiye Liu


Abstract
Prompt-based learning methods in semi-supervised learning (SSL) settings have been shown to be effective on multiple natural language understanding (NLU) datasets and tasks in the literature. However, manually designing multiple prompts and verbalizers requires domain knowledge and human effort, making it difficult and expensive to scale across different datasets. In this paper, we propose two methods to automatically design multiple prompts and integrate an automatic verbalizer in SSL settings without sacrificing performance. The first method uses various demonstration examples with learnable continuous prompt tokens to create diverse prompt models. The second method uses a varying number of soft prompt tokens to encourage language models to learn different prompts. For the verbalizer, we use a prototypical verbalizer in place of the manual one. In summary, we obtain the best average accuracy of 71.5% (a relative improvement of 0.99% over even the previous state-of-the-art SSL method, which uses manual prompts and verbalizers) across different few-shot learning settings.
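As a rough illustration of the soft-prompt idea described in the abstract, the sketch below prepends a block of learnable continuous prompt embeddings to the input embeddings, and varying the number of prompt tokens yields distinct prompt models. This is a minimal PyTorch sketch, not the authors' implementation: the name SoftPromptEncoder, the embedding size, and the token counts are illustrative assumptions.

```python
# Minimal sketch (not the paper's code) of learnable soft prompt tokens:
# a block of continuous embeddings is prepended to the input embeddings,
# and ensemble members differ in how many prompt tokens they use.
import torch
import torch.nn as nn

class SoftPromptEncoder(nn.Module):
    def __init__(self, embed_dim: int = 768, num_prompt_tokens: int = 20):
        super().__init__()
        # Learnable continuous prompt tokens, trained while the LM can stay frozen.
        self.prompt = nn.Parameter(torch.randn(num_prompt_tokens, embed_dim) * 0.02)

    def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
        # input_embeds: (batch, seq_len, embed_dim) from the LM's embedding layer.
        batch = input_embeds.size(0)
        prompt = self.prompt.unsqueeze(0).expand(batch, -1, -1)
        return torch.cat([prompt, input_embeds], dim=1)

# Varying the number of soft prompt tokens gives distinct prompt models:
encoders = [SoftPromptEncoder(num_prompt_tokens=n) for n in (5, 10, 20)]
x = torch.randn(2, 16, 768)            # stand-in for token embeddings
outputs = [enc(x) for enc in encoders]  # shapes: (2, 21, 768), (2, 26, 768), (2, 36, 768)
```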
Anthology ID:
2023.findings-eacl.58
Volume:
Findings of the Association for Computational Linguistics: EACL 2023
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
770–781
URL:
https://aclanthology.org/2023.findings-eacl.58
DOI:
10.18653/v1/2023.findings-eacl.58
Cite (ACL):
Yuhang Zhou, Suraj Maharjan, and Beiye Liu. 2023. Scalable Prompt Generation for Semi-supervised Learning with Language Models. In Findings of the Association for Computational Linguistics: EACL 2023, pages 770–781, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Scalable Prompt Generation for Semi-supervised Learning with Language Models (Zhou et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-eacl.58.pdf
Video:
https://aclanthology.org/2023.findings-eacl.58.mp4