ConEntail: An Entailment-based Framework for Universal Zero and Few Shot Classification with Supervised Contrastive Pretraining

Ranran Haoran Zhang, Aysa Xuemo Fan, Rui Zhang


Abstract
A universal classification model aims to generalize to diverse classification tasks in both zero and few shot settings. A promising way toward universal classification is to cast heterogeneous data formats into a dataset-agnostic “meta-task” (e.g., textual entailment, question answering) and then pretrain a model on the combined meta dataset. Existing work is either pretrained on specific subsets of classification tasks, or pretrained on both classification and generation data, but the resulting models fall short of their potential in universality and reliability. These approaches also leave a massive amount of annotated data under-exploited. To fill these gaps, we propose ConEntail, a new framework for universal zero and few shot classification with supervised contrastive pretraining. Our unified meta-task for classification is based on nested entailment. It can be interpreted as “Does sentence a entail [sentence b entails label c]?” This formulation enables us to make better use of 57 annotated classification datasets for supervised contrastive pretraining and universal evaluation. In this way, ConEntail helps the model (1) absorb knowledge from different datasets, and (2) achieve consistent performance gains with more pretraining data. In experiments, we compare our model with discriminative and generative models pretrained on the same dataset. The results confirm that our framework effectively exploits existing annotated data and consistently outperforms baselines in both zero shot (9.4% average improvement) and few shot settings (3.5% average improvement). Our code is available in supplementary materials.
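The abstract's nested entailment meta-task can be made concrete with a small sketch: for a query sentence a, a support sentence b, and a set of candidate labels c, build the hypothesis “b entails c” for each candidate and pick the label whose pair scores highest under an entailment scorer. The function names and the toy keyword scorer below are illustrative assumptions, not the paper's implementation; a real system would plug in a pretrained contrastive entailment model as the scorer.

```python
def build_nested_entailment_pairs(query, support_sentence, candidate_labels):
    """For each candidate label c, form the nested hypothesis
    '<support sentence b> entails <label c>' to pair with the query a."""
    return [(query, f"{support_sentence} entails {label}")
            for label in candidate_labels]

def classify(query, support_sentence, candidate_labels, entail_score):
    """Score every (query, nested hypothesis) pair and return the
    label with the highest entailment score."""
    pairs = build_nested_entailment_pairs(query, support_sentence,
                                          candidate_labels)
    scores = [entail_score(premise, hypothesis)
              for premise, hypothesis in pairs]
    best = max(range(len(scores)), key=scores.__getitem__)
    return candidate_labels[best]

# Toy scorer for demonstration only: a real scorer would be a
# pretrained entailment model returning a probability.
def toy_scorer(premise, hypothesis):
    return 1.0 if "positive" in hypothesis and "great" in premise else 0.0

label = classify("great movie", "this film is wonderful",
                 ["positive", "negative"], toy_scorer)
```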
Anthology ID:
2023.eacl-main.142
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1941–1953
URL:
https://aclanthology.org/2023.eacl-main.142
DOI:
10.18653/v1/2023.eacl-main.142
Cite (ACL):
Ranran Haoran Zhang, Aysa Xuemo Fan, and Rui Zhang. 2023. ConEntail: An Entailment-based Framework for Universal Zero and Few Shot Classification with Supervised Contrastive Pretraining. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 1941–1953, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
ConEntail: An Entailment-based Framework for Universal Zero and Few Shot Classification with Supervised Contrastive Pretraining (Zhang et al., EACL 2023)
PDF:
https://aclanthology.org/2023.eacl-main.142.pdf
Software:
2023.eacl-main.142.software.zip
Video:
https://aclanthology.org/2023.eacl-main.142.mp4