CCPrefix: Counterfactual Contrastive Prefix-Tuning for Many-Class Classification

Yang Li, Canran Xu, Guodong Long, Tao Shen, Chongyang Tao, Jing Jiang


Abstract
Recently, prefix-tuning was proposed to efficiently adapt pre-trained language models to a broad spectrum of natural language classification tasks. It leverages soft prefixes as task-specific indicators and language verbalizers as categorical-label mentions to narrow the formulation gap with pre-trained language models. However, when the label space grows considerably large (i.e., many-class classification), such a tuning technique suffers from a verbalizer ambiguity problem, since the many class labels are represented by semantically similar verbalizers expressed in short language phrases. To overcome this, inspired by the human decision process in which the most ambiguous classes for an instance are deliberated over, we propose a new prefix-tuning method, Counterfactual Contrastive Prefix-tuning (CCPrefix), for many-class classification. Concretely, an instance-dependent soft prefix, derived from fact-counterfactual pairs in the label space, is leveraged to complement the language verbalizers in many-class classification. We conduct experiments on many-class benchmark datasets in both the fully supervised setting and the few-shot setting, showing that our model outperforms previous baselines.
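
As a rough illustration of the mechanism the abstract describes, the sketch below shows how an instance-dependent soft prefix might be generated and prepended to a frozen language model's input embeddings. All names, shapes, and the reparameterization MLP here are assumptions for illustration only; they do not reproduce the authors' CCPrefix implementation (which derives prefixes from fact-counterfactual label pairs with a contrastive objective, see the released software below).

    # Hypothetical sketch: an instance-dependent soft prefix for prompt-based
    # classification. Shapes and module names are illustrative assumptions,
    # not the CCPrefix code.
    import torch
    import torch.nn as nn

    class InstanceDependentPrefix(nn.Module):
        """Maps a pooled instance encoding to `prefix_len` virtual prefix tokens."""

        def __init__(self, hidden_size: int, prefix_len: int = 10):
            super().__init__()
            self.prefix_len = prefix_len
            # Small reparameterization MLP, as is common in prefix-tuning variants.
            self.proj = nn.Sequential(
                nn.Linear(hidden_size, hidden_size),
                nn.Tanh(),
                nn.Linear(hidden_size, prefix_len * hidden_size),
            )

        def forward(self, instance_repr: torch.Tensor) -> torch.Tensor:
            # instance_repr: [batch, hidden] -> prefix: [batch, prefix_len, hidden]
            batch, hidden = instance_repr.shape
            return self.proj(instance_repr).view(batch, self.prefix_len, hidden)

    # Usage: prepend the generated prefix to the (frozen) LM's input embeddings.
    prefix_net = InstanceDependentPrefix(hidden_size=768, prefix_len=10)
    instance_repr = torch.randn(4, 768)       # e.g. a pooled encoding of the input text
    input_embeds = torch.randn(4, 128, 768)   # token embeddings from the frozen LM
    soft_prompted = torch.cat([prefix_net(instance_repr), input_embeds], dim=1)

Only the prefix generator would be trained in such a setup, keeping the backbone language model frozen, which is the parameter-efficiency argument behind prefix-tuning in general.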
Anthology ID:
2024.eacl-long.181
Volume:
Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
2977–2988
URL:
https://aclanthology.org/2024.eacl-long.181
Cite (ACL):
Yang Li, Canran Xu, Guodong Long, Tao Shen, Chongyang Tao, and Jing Jiang. 2024. CCPrefix: Counterfactual Contrastive Prefix-Tuning for Many-Class Classification. In Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2977–2988, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
CCPrefix: Counterfactual Contrastive Prefix-Tuning for Many-Class Classification (Li et al., EACL 2024)
PDF:
https://aclanthology.org/2024.eacl-long.181.pdf
Software:
 2024.eacl-long.181.software.zip
Note:
 2024.eacl-long.181.note.zip