Enriched Pre-trained Transformers for Joint Slot Filling and Intent Detection

Momchil Hardalov, Ivan Koychev, Preslav Nakov


Abstract
Detecting the user’s intent and finding the corresponding slots among the utterance’s words are important tasks in natural language understanding. Their interconnected nature makes their joint modeling a standard part of training such models. Moreover, data scarcity and specialized vocabularies pose additional challenges. Recently, advances in pre-trained language models, namely contextualized models such as ELMo and BERT, have revolutionized the field by tapping the potential of training very large models with just a few steps of fine-tuning on a task-specific dataset. Here, we leverage such models, and we design a novel architecture on top of them. Moreover, we propose an intent pooling attention mechanism, and we reinforce the slot filling task by fusing intent distributions, word features, and token representations. The experimental results on standard datasets show that our model outperforms both the current non-BERT state of the art as well as stronger BERT-based baselines.
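The abstract describes a joint architecture on top of a pre-trained Transformer: an attention-based pooling of token representations for intent detection, and a slot-filling head that fuses the intent distribution back into the per-token representations. Below is a minimal PyTorch sketch of that general idea, assuming a BERT-style encoder; the class name, layer choices, and dimensions are illustrative assumptions and do not reproduce the paper’s exact architecture.

```python
# Sketch of joint intent detection and slot filling on top of a BERT encoder.
# The intent pooling attention and the fusion of the intent distribution into
# the slot classifier follow the abstract's description only loosely; all
# names and sizes here are illustrative, not the authors' implementation.
import torch
import torch.nn as nn
from transformers import AutoModel


class JointIntentSlotModel(nn.Module):
    def __init__(self, model_name="bert-base-uncased",
                 num_intents=22, num_slots=120):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size

        # Intent pooling attention: score each token, pool a weighted sum.
        self.intent_attn = nn.Linear(hidden, 1)
        self.intent_clf = nn.Linear(hidden, num_intents)

        # Slot classifier sees each token representation concatenated with
        # the (soft) intent distribution, standing in for the paper's fusion.
        self.slot_clf = nn.Linear(hidden + num_intents, num_slots)

    def forward(self, input_ids, attention_mask):
        hidden = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state

        # Attention pooling over tokens for the utterance-level intent vector.
        scores = self.intent_attn(hidden).squeeze(-1)
        scores = scores.masked_fill(attention_mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)
        pooled = (weights * hidden).sum(dim=1)

        intent_logits = self.intent_clf(pooled)
        intent_dist = torch.softmax(intent_logits, dim=-1)

        # Broadcast the intent distribution to every token and fuse it with
        # the contextual token representations before slot tagging.
        fused = torch.cat(
            [hidden, intent_dist.unsqueeze(1).expand(-1, hidden.size(1), -1)],
            dim=-1,
        )
        slot_logits = self.slot_clf(fused)
        return intent_logits, slot_logits
```

In such a setup, the intent and slot losses (cross-entropy over intent labels and over per-token slot tags) would typically be summed and optimized jointly during fine-tuning.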
Anthology ID:
2023.ranlp-1.54
Volume:
Proceedings of the 14th International Conference on Recent Advances in Natural Language Processing
Month:
September
Year:
2023
Address:
Varna, Bulgaria
Editors:
Ruslan Mitkov, Galia Angelova
Venue:
RANLP
Publisher:
INCOMA Ltd., Shoumen, Bulgaria
Pages:
480–493
URL:
https://aclanthology.org/2023.ranlp-1.54
Cite (ACL):
Momchil Hardalov, Ivan Koychev, and Preslav Nakov. 2023. Enriched Pre-trained Transformers for Joint Slot Filling and Intent Detection. In Proceedings of the 14th International Conference on Recent Advances in Natural Language Processing, pages 480–493, Varna, Bulgaria. INCOMA Ltd., Shoumen, Bulgaria.
Cite (Informal):
Enriched Pre-trained Transformers for Joint Slot Filling and Intent Detection (Hardalov et al., RANLP 2023)
PDF:
https://aclanthology.org/2023.ranlp-1.54.pdf