Topic Spotting using Hierarchical Networks with Self Attention

Pooja Chitkara, Ashutosh Modi, Pravalika Avvaru, Sepehr Janghorbani, Mubbasir Kapadia


Abstract
The success of deep learning techniques has renewed interest in the development of dialogue systems. However, current systems struggle to sustain consistent long-term conversations with users and fail to build rapport. Topic spotting, the task of automatically inferring the topic of a conversation, has been shown to make dialogue systems more engaging and efficient. We propose a hierarchical model with self-attention for topic spotting. Experiments on the Switchboard corpus show that our model outperforms previously proposed techniques for topic spotting as well as deep models for text classification. Additionally, in contrast to offline processing of dialogue, we also analyze the performance of our model in a more realistic setting, i.e., an online setting where the topic is identified in real time as the dialogue progresses. Results show that our model is able to generalize even with the limited information available in the online setting.
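The hierarchical idea described in the abstract can be sketched as two levels of attention pooling: word-level states are pooled into an utterance vector, and utterance vectors are pooled into a dialogue vector that feeds a topic classifier. The sketch below is a minimal NumPy illustration of that structure, not the paper's implementation; the attention query vectors, dimensions, and random embeddings (standing in for recurrent encoder states) are all illustrative assumptions.

```python
import numpy as np

def self_attention_pool(H, w):
    """Pool a sequence of hidden states H (seq_len, d) into one vector
    via softmax attention with query vector w (d,)."""
    scores = H @ w
    alphas = np.exp(scores - scores.max())
    alphas /= alphas.sum()
    return alphas @ H  # (d,) attention-weighted sum

rng = np.random.default_rng(0)
d = 8  # hidden size (illustrative)

# A dialogue of 3 utterances with varying lengths; random vectors
# stand in for word-level encoder states.
dialog = [rng.normal(size=(n, d)) for n in (5, 3, 7)]
w_word = rng.normal(size=d)  # word-level attention query (assumed)
w_utt = rng.normal(size=d)   # utterance-level attention query (assumed)

# Level 1: pool word states into one vector per utterance.
utt_vecs = np.stack([self_attention_pool(U, w_word) for U in dialog])

# Level 2: pool utterance vectors into a single dialogue vector.
dialog_vec = self_attention_pool(utt_vecs, w_utt)

# Linear classifier over a hypothetical set of 4 topic labels.
W = rng.normal(size=(d, 4))
topic = int(np.argmax(dialog_vec @ W))
```

In an online setting, the second pooling step would simply be re-run over the utterance vectors seen so far, so a topic prediction is available after every utterance.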
Anthology ID:
N19-1376
Volume:
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
3755–3761
URL:
https://aclanthology.org/N19-1376
DOI:
10.18653/v1/N19-1376
Cite (ACL):
Pooja Chitkara, Ashutosh Modi, Pravalika Avvaru, Sepehr Janghorbani, and Mubbasir Kapadia. 2019. Topic Spotting using Hierarchical Networks with Self Attention. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 3755–3761, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal):
Topic Spotting using Hierarchical Networks with Self Attention (Chitkara et al., NAACL 2019)
PDF:
https://aclanthology.org/N19-1376.pdf