ASCM: An Answer Space Clustered Prompting Method without Answer Engineering

Zhen Wang, Yating Yang, Zhou Xi, Bo Ma, Lei Wang, Rui Dong, Azmat Anwar


Abstract
Prompt-based learning, which exploits knowledge from pre-trained language models by providing textual prompts and designing appropriate answer-category mapping methods, has achieved impressive success on few-shot text classification and natural language inference (NLI). Because linguistic expression is diverse, many different answer tokens can correspond to the same category. However, both manual answer design and automatic answer search constrain the answer space and therefore rarely achieve ideal performance. To address this issue, we propose an answer space clustered prompting model (ASCM) together with a synonym initialization method (SI), which automatically categorizes all answer tokens in a semantically clustered embedding space. We also propose a stable semi-supervised method named stair learning (SL) that distills knowledge from stronger models to weaker models in an ordered fashion. Extensive experiments demonstrate that ASCM+SL significantly outperforms existing state-of-the-art techniques in few-shot settings.
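The sketch below illustrates the answer-space clustering idea from the abstract: instead of mapping each category to a single hand-picked verbalizer token, candidate answer tokens are grouped in an embedding space and each cluster is tied to a label via seed synonyms. It is a minimal illustration under stated assumptions (bert-base-uncased input embeddings, a hypothetical seed_synonyms dictionary, off-the-shelf k-means), not the authors' implementation; their code is linked under "Code" below.

```python
# Illustrative sketch only (NOT the paper's implementation): cluster the
# embeddings of candidate answer tokens so that each cluster corresponds to a
# label, instead of hand-picking one verbalizer token per class.
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.cluster import KMeans

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Hypothetical seed synonyms per NLI label (the "synonym initialization" idea).
seed_synonyms = {
    "entailment":    ["yes", "right", "correct"],
    "neutral":       ["maybe", "perhaps", "possibly"],
    "contradiction": ["no", "wrong", "false"],
}

def token_embedding(word: str) -> torch.Tensor:
    """Average the input-embedding vectors of a word's subword tokens."""
    ids = tokenizer(word, add_special_tokens=False)["input_ids"]
    emb = model.get_input_embeddings().weight[ids]
    return emb.mean(dim=0)

# Stack all seed-token embeddings and run k-means with one cluster per label.
words, labels = [], []
for label, syns in seed_synonyms.items():
    for w in syns:
        words.append(w)
        labels.append(label)
X = torch.stack([token_embedding(w) for w in words]).detach().numpy()
kmeans = KMeans(n_clusters=len(seed_synonyms), n_init=10, random_state=0).fit(X)

# Each centroid acts as a class prototype: a cluster is associated with the
# label of the seed tokens it absorbs, and any further answer token can be
# assigned to the label of its nearest centroid rather than via a fixed map.
print(dict(zip(words, kmeans.labels_.tolist())))
```

In the paper the category mapping is learned jointly with the prompt model; the k-means step here only stands in for that learned clustering to make the answer-space idea concrete.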
Anthology ID:
2022.findings-acl.193
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2455–2469
URL:
https://aclanthology.org/2022.findings-acl.193
DOI:
10.18653/v1/2022.findings-acl.193
Cite (ACL):
Zhen Wang, Yating Yang, Zhou Xi, Bo Ma, Lei Wang, Rui Dong, and Azmat Anwar. 2022. ASCM: An Answer Space Clustered Prompting Method without Answer Engineering. In Findings of the Association for Computational Linguistics: ACL 2022, pages 2455–2469, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
ASCM: An Answer Space Clustered Prompting Method without Answer Engineering (Wang et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.193.pdf
Code
miaomiao1215/ascm
Data
MultiNLI