Going beyond research datasets: Novel intent discovery in the industry setting

Aleksandra Chrabrowa, Tsimur Hadeliya, Dariusz Kajtoch, Robert Mroczkowski, Piotr Rybak

Abstract
Novel intent discovery automates the process of grouping similar messages (questions) to identify previously unknown intents. However, current research focuses on publicly available datasets which contain only the question field and differ significantly from real-life datasets. This paper proposes methods to improve the intent discovery pipeline deployed in a large e-commerce platform. We show the benefit of pre-training language models on in-domain data, both self-supervised and with weak supervision. We also devise the best method to utilize the conversational structure (i.e., question and answer) of real-life datasets during fine-tuning for clustering tasks, which we call Conv. Combined, our methods, which fully utilize real-life datasets, give up to a 33pp performance boost over the state-of-the-art Constrained Deep Adaptive Clustering (CDAC) model applied to questions only. By comparison, the CDAC model on question data alone gives only up to a 13pp boost over the naive baseline.
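The contrast drawn in the abstract between question-only clustering and clustering that uses the conversational structure (question plus answer) can be illustrated with a minimal sketch. This is not the paper's CDAC-based pipeline; the encoder model, the toy data, the use of k-means, and the Adjusted Rand Index as the quality measure are all assumptions made for illustration.

```python
# Minimal illustrative sketch (not the paper's pipeline): embed support
# messages and cluster them with k-means, once using the question text alone
# and once using the concatenated question + answer ("conversational") text.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

# Toy conversations with hypothetical gold intent labels.
conversations = [
    ("Where is my package?", "Your parcel is still in transit.", "delivery"),
    ("My order has not arrived yet.", "We will check the courier status.", "delivery"),
    ("How do I return a broken item?", "Please open a return request.", "returns"),
    ("I want to send the product back.", "Returns are free within 30 days.", "returns"),
]
questions = [q for q, _, _ in conversations]
question_answer = [f"{q} {a}" for q, a, _ in conversations]
gold = [label for _, _, label in conversations]

# Assumed off-the-shelf sentence encoder; the paper instead pre-trains
# in-domain language models before fine-tuning for clustering.
encoder = SentenceTransformer("all-MiniLM-L6-v2")

def cluster_and_score(texts, n_clusters=2):
    """Cluster the texts and score the grouping against the gold intents."""
    embeddings = encoder.encode(texts)
    predicted = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(embeddings)
    return adjusted_rand_score(gold, predicted)

print("question only      ARI:", cluster_and_score(questions))
print("question + answer  ARI:", cluster_and_score(question_answer))
```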
Anthology ID: 2023.findings-eacl.68
Volume: Findings of the Association for Computational Linguistics: EACL 2023
Month: May
Year: 2023
Address: Dubrovnik, Croatia
Editors: Andreas Vlachos, Isabelle Augenstein
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 925–941
URL: https://aclanthology.org/2023.findings-eacl.68
DOI: 10.18653/v1/2023.findings-eacl.68
Cite (ACL): Aleksandra Chrabrowa, Tsimur Hadeliya, Dariusz Kajtoch, Robert Mroczkowski, and Piotr Rybak. 2023. Going beyond research datasets: Novel intent discovery in the industry setting. In Findings of the Association for Computational Linguistics: EACL 2023, pages 925–941, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal): Going beyond research datasets: Novel intent discovery in the industry setting (Chrabrowa et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-eacl.68.pdf
Video: https://aclanthology.org/2023.findings-eacl.68.mp4