Pre-training Intent-Aware Encoders for Zero- and Few-Shot Intent Classification

Mujeen Sung, James Gung, Elman Mansimov, Nikolaos Pappas, Raphael Shu, Salvatore Romeo, Yi Zhang, Vittorio Castelli


Abstract
Intent classification (IC) plays an important role in task-oriented dialogue systems. However, IC models often generalize poorly when trained without sufficient annotated examples for each user intent. We propose a novel pre-training method for text encoders that uses contrastive learning with intent pseudo-labels to produce embeddings that are well-suited for IC tasks, reducing the need for manual annotations. Applying this pre-training strategy, we introduce the Pre-trained Intent-aware Encoder (PIE), which is designed to align encodings of utterances with their intent names. Specifically, we first train a tagger to identify key phrases within utterances that are crucial for interpreting intents. We then use these extracted phrases to create examples for pre-training a text encoder in a contrastive manner. As a result, our PIE model achieves up to 5.4% and 4.0% higher accuracy than the previous state-of-the-art pre-trained text encoder in the N-way zero- and one-shot settings on four IC datasets.
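The contrastive alignment described in the abstract can be illustrated with a minimal sketch (not the authors' code): utterance embeddings are pulled toward the embedding of their intent pseudo-label and pushed away from the other intents in the batch. The encoder, batch construction, and temperature value below are assumptions for illustration only.

```python
# Minimal sketch of an InfoNCE-style utterance/intent-name alignment objective.
# NOT the authors' implementation; encoder outputs are stubbed with random tensors.
import torch
import torch.nn.functional as F

def intent_contrastive_loss(utt_emb, intent_emb, temperature=0.05):
    """Row i of utt_emb should match row i of intent_emb (its intent pseudo-label)."""
    utt_emb = F.normalize(utt_emb, dim=-1)
    intent_emb = F.normalize(intent_emb, dim=-1)
    logits = utt_emb @ intent_emb.T / temperature   # (B, B) cosine similarities
    targets = torch.arange(utt_emb.size(0))         # positives lie on the diagonal
    return F.cross_entropy(logits, targets)

# Toy usage: random vectors stand in for encoder outputs of utterances and intent names.
batch_size, dim = 8, 768
loss = intent_contrastive_loss(torch.randn(batch_size, dim), torch.randn(batch_size, dim))
print(loss.item())
```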
Anthology ID:
2023.emnlp-main.646
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
10433–10442
URL:
https://aclanthology.org/2023.emnlp-main.646
DOI:
10.18653/v1/2023.emnlp-main.646
Cite (ACL):
Mujeen Sung, James Gung, Elman Mansimov, Nikolaos Pappas, Raphael Shu, Salvatore Romeo, Yi Zhang, and Vittorio Castelli. 2023. Pre-training Intent-Aware Encoders for Zero- and Few-Shot Intent Classification. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 10433–10442, Singapore. Association for Computational Linguistics.
Cite (Informal):
Pre-training Intent-Aware Encoders for Zero- and Few-Shot Intent Classification (Sung et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.646.pdf
Video:
https://aclanthology.org/2023.emnlp-main.646.mp4