A Transformer-based Threshold-Free Framework for Multi-Intent NLU

Lisung Chen, Nuo Chen, Yuexian Zou, Yong Wang, Xinzhong Sun


Abstract
Multi-intent natural language understanding (NLU) has recently gained attention. It detects multiple intents in a single utterance, which better suits real-world scenarios. However, state-of-the-art joint NLU models mainly detect multiple intents with a threshold-based strategy, which causes one main issue: the model is extremely sensitive to the threshold settings. In this paper, we propose a transformer-based Threshold-Free Multi-intent NLU model (TFMN) with multi-task learning (MTL). Specifically, we first leverage multiple layers of a transformer-based encoder to generate multi-grain representations. Then we exploit the information about the number of intents in each utterance, which requires no additional manual annotation, and propose an auxiliary detection task: Intent Number Detection (IND). Furthermore, we propose a threshold-free multi-intent classifier that utilizes the output of the IND task and detects multiple intents without relying on a threshold. Extensive experiments demonstrate that our proposed model achieves superior results on two public multi-intent datasets.
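
The threshold-free decoding idea described in the abstract can be sketched as follows: instead of keeping every intent whose score exceeds a fixed threshold, the model first predicts the number of intents k (the IND task) and then keeps the k highest-scoring intent labels. Below is a minimal PyTorch sketch of this top-k decoding step, with hypothetical tensor shapes and function names; it illustrates the idea rather than reproducing the authors' implementation:

```python
import torch

def threshold_free_decode(intent_logits: torch.Tensor,
                          count_logits: torch.Tensor) -> list[list[int]]:
    """Select intents via a predicted intent count instead of a fixed threshold.

    intent_logits: (batch, num_intents) scores from the multi-intent classifier.
    count_logits:  (batch, max_intents) scores from the auxiliary IND head,
                   where class i means "the utterance contains i+1 intents".
    (Shapes and naming are assumptions for illustration.)
    """
    # Predicted number of intents per utterance (1-based).
    k = count_logits.argmax(dim=-1) + 1          # (batch,)
    decoded = []
    for scores, ki in zip(intent_logits, k):
        # Keep the ki highest-scoring intent labels; no threshold involved.
        top = torch.topk(scores, int(ki)).indices
        decoded.append(sorted(top.tolist()))
    return decoded

# Example: one utterance, 5 intent labels, up to 3 intents per utterance.
logits = torch.tensor([[2.1, -0.3, 1.7, 0.2, -1.0]])
counts = torch.tensor([[0.1, 3.0, 0.2]])          # argmax=1 -> 2 intents
print(threshold_free_decode(logits, counts))      # [[0, 2]]
```

Because the number of selected labels comes from the IND head, the decoding is insensitive to the absolute scale of the classifier scores, which is exactly the sensitivity a fixed threshold suffers from.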
Anthology ID:
2022.coling-1.629
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Editors:
Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
7187–7192
URL:
https://aclanthology.org/2022.coling-1.629
Cite (ACL):
Lisung Chen, Nuo Chen, Yuexian Zou, Yong Wang, and Xinzhong Sun. 2022. A Transformer-based Threshold-Free Framework for Multi-Intent NLU. In Proceedings of the 29th International Conference on Computational Linguistics, pages 7187–7192, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
A Transformer-based Threshold-Free Framework for Multi-Intent NLU (Chen et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.629.pdf
Data
ATIS, MixATIS, MixSNIPS, SNIPS