PCMID: Multi-Intent Detection through Supervised Prototypical Contrastive Learning

Yurun Song, Junchen Zhao, Spencer Koehler, Amir Abdullah, Ian Harris


Abstract
Intent detection is a major task in Natural Language Understanding (NLU) and is the component of dialogue systems responsible for interpreting users’ intentions based on their utterances. Many prior works detect intents by assuming that each utterance expresses only a single intent. Such systems have achieved very good results; however, intent detection is far more challenging in typical real-world scenarios, where a user utterance can be highly complex and express multiple intents. Therefore, in this paper, we propose PCMID, a novel Multi-Intent Detection framework enabled by Prototypical Contrastive Learning under a supervised setting. PCMID learns multiple semantic representations of a given user utterance under the context of different intent labels in an optimized semantic space. Our experiments show that PCMID achieves state-of-the-art performance on multiple public benchmark datasets as well as a private real-world dataset for the multi-intent detection task.
Anthology ID:
2023.findings-emnlp.636
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
9481–9495
URL:
https://aclanthology.org/2023.findings-emnlp.636
DOI:
10.18653/v1/2023.findings-emnlp.636
Cite (ACL):
Yurun Song, Junchen Zhao, Spencer Koehler, Amir Abdullah, and Ian Harris. 2023. PCMID: Multi-Intent Detection through Supervised Prototypical Contrastive Learning. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 9481–9495, Singapore. Association for Computational Linguistics.
Cite (Informal):
PCMID: Multi-Intent Detection through Supervised Prototypical Contrastive Learning (Song et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.636.pdf