Data Augmentation for Intent Classification with Off-the-shelf Large Language Models

Gaurav Sahu, Pau Rodriguez, Issam Laradji, Parmida Atighehchian, David Vazquez, Dzmitry Bahdanau


Abstract
Data augmentation is a widely employed technique to alleviate the problem of data scarcity. In this work, we propose a prompting-based approach to generate labelled training data for intent classification with off-the-shelf language models (LMs) such as GPT-3. An advantage of this method is that no task-specific fine-tuning of the LM for data generation is required; hence the method needs no hyperparameter tuning and is applicable even when the available training data is very scarce. We evaluate the proposed method in a few-shot setting on four diverse intent classification tasks. We find that GPT-generated data significantly boosts the performance of intent classifiers when the intents under consideration are sufficiently distinct from each other. In tasks with semantically close intents, we observe that the generated data is less helpful. Our analysis shows that this is because GPT often generates utterances that belong to a closely related intent instead of the desired one. We present preliminary evidence that a prompting-based GPT classifier could be helpful in filtering the generated data to enhance its quality.
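The abstract describes formatting a handful of seed utterances into a prompt and asking an off-the-shelf LM to continue with new labelled examples. The sketch below illustrates that idea; the function name, prompt wording, and example intent are illustrative assumptions, not the paper's actual template.

```python
# Hypothetical sketch of prompt-based data augmentation for intent
# classification: seed utterances for one intent are formatted into a
# few-shot prompt, and an off-the-shelf LM would be asked to continue
# the numbered list with novel utterances for the same intent.
# The prompt wording here is an assumption, not the paper's template.

def build_augmentation_prompt(intent: str, seed_utterances: list[str],
                              n_new: int = 5) -> str:
    """Format seed examples into a few-shot generation prompt."""
    lines = [f"The following are example utterances for the intent '{intent}':"]
    lines += [f"{i + 1}. {u}" for i, u in enumerate(seed_utterances)]
    # Ask the LM to continue the numbered list with new examples.
    lines.append(f"Write {n_new} more utterances for the intent '{intent}':")
    lines.append(f"{len(seed_utterances) + 1}.")
    return "\n".join(lines)

prompt = build_augmentation_prompt(
    "transfer_money",
    ["send $50 to my savings account", "move money between my accounts"],
)
print(prompt)
```

The LM's completion of the numbered list would then be split into new training utterances, each labelled with the prompted intent; the paper additionally suggests filtering these with a prompting-based GPT classifier, since generated utterances can drift into closely related intents.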
Anthology ID:
2022.nlp4convai-1.5
Volume:
Proceedings of the 4th Workshop on NLP for Conversational AI
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venues:
ACL | NLP4ConvAI
Publisher:
Association for Computational Linguistics
Pages:
47–57
URL:
https://aclanthology.org/2022.nlp4convai-1.5
DOI:
10.18653/v1/2022.nlp4convai-1.5
Cite (ACL):
Gaurav Sahu, Pau Rodriguez, Issam Laradji, Parmida Atighehchian, David Vazquez, and Dzmitry Bahdanau. 2022. Data Augmentation for Intent Classification with Off-the-shelf Large Language Models. In Proceedings of the 4th Workshop on NLP for Conversational AI, pages 47–57, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Data Augmentation for Intent Classification with Off-the-shelf Large Language Models (Sahu et al., NLP4ConvAI 2022)
PDF:
https://aclanthology.org/2022.nlp4convai-1.5.pdf
Code
 elementai/data-augmentation-with-llms
Data
CLINC150