Exploiting Cloze-Questions for Few-Shot Text Classification and Natural Language Inference

Timo Schick, Hinrich Schütze


Abstract
Some NLP tasks can be solved in a fully unsupervised fashion by providing a pretrained language model with “task descriptions” in natural language (e.g., Radford et al., 2019). While this approach underperforms its supervised counterpart, we show in this work that the two ideas can be combined: We introduce Pattern-Exploiting Training (PET), a semi-supervised training procedure that reformulates input examples as cloze-style phrases to help language models understand a given task. These phrases are then used to assign soft labels to a large set of unlabeled examples. Finally, standard supervised training is performed on the resulting training set. For several tasks and languages, PET outperforms supervised training and strong semi-supervised approaches in low-resource settings by a large margin.
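To make the cloze reformulation concrete, below is a minimal Python sketch of PET's core scoring step, not the authors' implementation: a classification input is wrapped in a pattern containing a mask token, and a masked language model scores one "verbalizer" word per label at the masked position. The model name, the pattern "It was ___.", and the verbalizer words are illustrative assumptions for a binary sentiment task.

```python
# Minimal sketch of PET-style cloze scoring (illustrative, not the
# reference implementation from timoschick/pet).
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_name = "bert-base-uncased"  # assumption: any masked LM works here
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)
model.eval()

def pattern(text: str) -> str:
    # Hypothetical pattern: reformulate the input as a cloze phrase.
    return f"{text} It was {tokenizer.mask_token}."

# Hypothetical verbalizer: one vocabulary word per label.
verbalizer = {"positive": "great", "negative": "terrible"}

def label_scores(text: str) -> dict:
    """Score each label by the masked-LM logit of its verbalizer word."""
    inputs = tokenizer(pattern(text), return_tensors="pt")
    # Position of the [MASK] token in the input sequence.
    mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos]
    return {
        label: logits[tokenizer.convert_tokens_to_ids(word)].item()
        for label, word in verbalizer.items()
    }

print(label_scores("The movie was a joy from start to finish."))
```

Normalizing these scores over the verbalizer words yields a probability distribution per example; as the abstract describes, PET uses such distributions as soft labels for a large unlabeled set, on which a standard supervised classifier is then trained.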
Anthology ID:
2021.eacl-main.20
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
Month:
April
Year:
2021
Address:
Online
Editors:
Paola Merlo, Jörg Tiedemann, Reut Tsarfaty
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
255–269
URL:
https://aclanthology.org/2021.eacl-main.20
DOI:
10.18653/v1/2021.eacl-main.20
Cite (ACL):
Timo Schick and Hinrich Schütze. 2021. Exploiting Cloze-Questions for Few-Shot Text Classification and Natural Language Inference. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 255–269, Online. Association for Computational Linguistics.
Cite (Informal):
Exploiting Cloze-Questions for Few-Shot Text Classification and Natural Language Inference (Schick & Schütze, EACL 2021)
PDF:
https://aclanthology.org/2021.eacl-main.20.pdf
Code:
timoschick/pet
Data:
MultiNLI, x-stance