LARA: Linguistic-Adaptive Retrieval-Augmentation for Multi-Turn Intent Classification

Junhua Liu, Tan Yong Keat, Bin Fu, Kwan Hui Lim


Abstract
Multi-turn intent classification is notably challenging due to the complexity and evolving nature of conversational contexts. This paper introduces LARA, a Linguistic-Adaptive Retrieval-Augmentation framework that improves accuracy on multi-turn classification tasks across six languages, accommodating numerous intents in chatbot interactions. LARA combines a fine-tuned smaller model with a retrieval-augmented mechanism integrated within the architecture of LLMs. This integration allows LARA to dynamically draw on past dialogues and relevant intents, thereby improving its understanding of the context. Furthermore, our adaptive retrieval techniques bolster the cross-lingual capabilities of LLMs without extensive retraining or fine-tuning. Comprehensive experiments demonstrate that LARA achieves state-of-the-art performance on multi-turn intent classification tasks, improving average accuracy by 3.67% over state-of-the-art single-turn intent classifiers.
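The retrieval-augmented setup described above can be illustrated with a minimal sketch: retrieve labeled exemplar utterances similar to the current query, then assemble them with the dialogue history into an in-context prompt for an LLM. All names here (the exemplar store, the bag-of-words retriever, the prompt format) are illustrative assumptions, not LARA's actual components.

```python
# Hedged sketch of retrieval-augmented intent classification.
# The exemplar store, retriever, and prompt template are hypothetical;
# LARA's real retriever and prompt design are described in the paper.
import math
from collections import Counter

def bow_cosine(a: str, b: str) -> float:
    """Cosine similarity over bag-of-words counts (stand-in for a real encoder)."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[t] * vb[t] for t in va)
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve_exemplars(query: str, store: list[tuple[str, str]], k: int = 2):
    """Return the k (utterance, intent) pairs most similar to the query."""
    return sorted(store, key=lambda ex: bow_cosine(query, ex[0]), reverse=True)[:k]

def build_prompt(history: list[str], query: str, exemplars) -> str:
    """Assemble an in-context prompt from retrieved exemplars plus dialogue history."""
    lines = ["Classify the intent of the final user turn."]
    for utt, intent in exemplars:
        lines.append(f"Utterance: {utt}\nIntent: {intent}")
    lines.append("Dialogue: " + " | ".join(history + [query]))
    lines.append("Intent:")
    return "\n".join(lines)

# Toy exemplar store of (utterance, intent) pairs.
store = [
    ("where is my package", "track_order"),
    ("i want my money back", "refund_request"),
    ("change my delivery address", "update_address"),
]
exemplars = retrieve_exemplars("my package has not arrived where is it", store)
prompt = build_prompt(["hi", "i ordered last week"], "where is it now", exemplars)
```

In a full system the bag-of-words retriever would be replaced by a multilingual dense encoder, and `prompt` would be sent to the LLM for the final intent prediction.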
Anthology ID:
2024.emnlp-industry.82
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing: Industry Track
Month:
November
Year:
2024
Address:
Miami, Florida, US
Editors:
Franck Dernoncourt, Daniel Preoţiuc-Pietro, Anastasia Shimorina
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1096–1106
URL:
https://aclanthology.org/2024.emnlp-industry.82
Cite (ACL):
Junhua Liu, Tan Yong Keat, Bin Fu, and Kwan Hui Lim. 2024. LARA: Linguistic-Adaptive Retrieval-Augmentation for Multi-Turn Intent Classification. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing: Industry Track, pages 1096–1106, Miami, Florida, US. Association for Computational Linguistics.
Cite (Informal):
LARA: Linguistic-Adaptive Retrieval-Augmentation for Multi-Turn Intent Classification (Liu et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-industry.82.pdf