The Benefits of Label-Description Training for Zero-Shot Text Classification

Lingyu Gao, Debanjan Ghosh, Kevin Gimpel


Abstract
Pretrained language models have improved zero-shot text classification by allowing the transfer of semantic knowledge from the training data in order to classify among specific label sets in downstream tasks. We propose a simple way to further improve zero-shot accuracies with minimal effort. We curate small finetuning datasets intended to describe the labels for a task. Unlike typical finetuning data, which has texts annotated with labels, our data simply describes the labels in language, e.g., using a few related terms, dictionary/encyclopedia entries, and short templates. Across a range of topic and sentiment datasets, our method is more accurate than zero-shot by 17-19% absolute. It is also more robust to choices required for zero-shot classification, such as patterns for prompting the model to classify and mappings from labels to tokens in the model’s vocabulary. Furthermore, since our data merely describes the labels but does not use input texts, finetuning on it yields a model that performs strongly on multiple text domains for a given label set, even improving over few-shot out-of-domain classification in multiple settings.
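The data-curation idea described in the abstract — describing each label with a few related terms, a dictionary-style definition, and short templates, rather than annotating input texts — can be sketched as follows. The labels, terms, definitions, and templates here are illustrative assumptions, not the paper's actual curated data:

```python
# Minimal sketch of building label-description finetuning data for a
# sentiment label set. Each label is described in language; the
# resulting (text, label) pairs are used for finetuning instead of
# annotated input texts.

LABEL_DESCRIPTIONS = {
    "positive": {
        "terms": ["great", "wonderful", "excellent"],
        "definition": "expressing approval, praise, or satisfaction",
        "templates": ["It was good.", "I liked it."],
    },
    "negative": {
        "terms": ["terrible", "awful", "dreadful"],
        "definition": "expressing disapproval, criticism, or dissatisfaction",
        "templates": ["It was bad.", "I hated it."],
    },
}


def build_label_description_examples(descriptions):
    """Turn per-label descriptions into (text, label) finetuning pairs."""
    examples = []
    for label, desc in descriptions.items():
        # Related terms: each term stands alone as a tiny training text.
        for term in desc["terms"]:
            examples.append((term, label))
        # Dictionary/encyclopedia-style definition of the label.
        examples.append((desc["definition"], label))
        # Short templates expressing the label.
        for template in desc["templates"]:
            examples.append((template, label))
    return examples


examples = build_label_description_examples(LABEL_DESCRIPTIONS)
```

Because no input texts from any target domain appear in these examples, a model finetuned on them is not tied to one domain, which is consistent with the cross-domain robustness the abstract reports.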
Anthology ID:
2023.emnlp-main.853
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
13823–13844
URL:
https://aclanthology.org/2023.emnlp-main.853
DOI:
10.18653/v1/2023.emnlp-main.853
Cite (ACL):
Lingyu Gao, Debanjan Ghosh, and Kevin Gimpel. 2023. The Benefits of Label-Description Training for Zero-Shot Text Classification. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 13823–13844, Singapore. Association for Computational Linguistics.
Cite (Informal):
The Benefits of Label-Description Training for Zero-Shot Text Classification (Gao et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.853.pdf
Video:
https://aclanthology.org/2023.emnlp-main.853.mp4