Self-Attentional Models Application in Task-Oriented Dialogue Generation Systems

Mansour Saffar Mehrjardi, Amine Trabelsi, Osmar R. Zaiane


Abstract
Self-attentional models are a new paradigm for sequence modelling tasks: unlike common sequence modelling methods, such as recurrence-based and convolution-based sequence learning, their architecture is based solely on the attention mechanism. Self-attentional models have been used to build state-of-the-art models for many NLP tasks, such as neural machine translation, but their use has not yet been explored for training end-to-end task-oriented dialogue generation systems. In this study, we apply these models to the DSTC2 dataset for training task-oriented chatbots. Our findings show that self-attentional models can be exploited to create end-to-end task-oriented chatbots which not only achieve higher evaluation scores compared to recurrence-based models, but also do so more efficiently.
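For readers unfamiliar with the mechanism the abstract refers to, below is a minimal NumPy sketch of single-head scaled dot-product self-attention, the building block such models rely on: Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. The function name, weight matrices, and toy dimensions are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def scaled_dot_product_self_attention(X, Wq, Wk, Wv):
    """Single-head self-attention over a sequence X of shape (T, d_model)."""
    Q = X @ Wq  # queries, shape (T, d_k)
    K = X @ Wk  # keys,    shape (T, d_k)
    V = X @ Wv  # values,  shape (T, d_v)
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise compatibility, shape (T, T)
    # Numerically stable softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each position is a weighted mix of all positions

# Toy usage: a 5-token sequence with d_model = d_k = 8 (assumed sizes).
rng = np.random.default_rng(0)
T, d_model, d_k = 5, 8, 8
X = rng.standard_normal((T, d_model))
Wq, Wk, Wv = (rng.standard_normal((d_model, d_k)) for _ in range(3))
out = scaled_dot_product_self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (5, 8)
```

Because every position attends to every other position in a single matrix multiplication, such layers avoid the step-by-step dependency of recurrent models, which is the efficiency advantage the abstract alludes to.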
Anthology ID: R19-1119
Volume: Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019)
Month: September
Year: 2019
Address: Varna, Bulgaria
Editors: Ruslan Mitkov, Galia Angelova
Venue: RANLP
Publisher: INCOMA Ltd.
Pages: 1031–1040
URL: https://aclanthology.org/R19-1119
DOI: 10.26615/978-954-452-056-4_119
Cite (ACL): Mansour Saffar Mehrjardi, Amine Trabelsi, and Osmar R. Zaiane. 2019. Self-Attentional Models Application in Task-Oriented Dialogue Generation Systems. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019), pages 1031–1040, Varna, Bulgaria. INCOMA Ltd.
Cite (Informal): Self-Attentional Models Application in Task-Oriented Dialogue Generation Systems (Saffar Mehrjardi et al., RANLP 2019)
PDF: https://aclanthology.org/R19-1119.pdf