Towards Efficient Dialogue Processing in the Emergency Response Domain

Tatiana Anikina


Abstract
In this paper, we describe the task of adapting NLP models to dialogue processing in the emergency response domain. Our goal is to provide a recipe for building a system that performs dialogue act classification and domain-specific slot tagging while being efficient, flexible, and robust. We show that adapter models (Pfeiffer et al., 2020) perform well in the emergency response domain and benefit from additional dialogue context and speaker information. Comparing adapters to standard fine-tuned Transformer models, we show that they achieve competitive results and can easily accommodate new tasks without a significant increase in memory, since the base model can be shared between the adapters specializing in different tasks. We also address the problem of scarce annotations in the emergency response domain and evaluate different data augmentation techniques in a low-resource setting.
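The adapter setup described in the abstract (one shared Transformer backbone with separate, task-specific adapters for dialogue act classification and slot tagging) can be sketched with the AdapterHub adapters library, which implements the adapter architecture of Pfeiffer et al. (2020). The snippet below is a minimal illustration under assumed settings, not the authors' exact configuration: the base checkpoint, adapter names, and label counts are placeholders.

# Minimal sketch: sharing one Transformer backbone between two task adapters
# with the AdapterHub `adapters` library (checkpoint, names, and label counts
# are illustrative placeholders, not taken from the paper).
from adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("bert-base-uncased")

# One adapter plus prediction head per task; the backbone is shared.
model.add_adapter("dialogue_act")
model.add_classification_head("dialogue_act", num_labels=10)  # dialogue act classification

model.add_adapter("slot_tagging")
model.add_tagging_head("slot_tagging", num_labels=20)         # domain-specific slot tagging

# Training updates only the selected adapter and head; the backbone stays frozen,
# so adding a new task adds only a small number of parameters.
model.train_adapter("dialogue_act")
model.set_active_adapters("dialogue_act")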
Anthology ID: 2023.acl-srw.31
Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop)
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Vishakh Padmakumar, Gisela Vallejo, Yao Fu
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 212–225
URL: https://aclanthology.org/2023.acl-srw.31
DOI: 10.18653/v1/2023.acl-srw.31
Cite (ACL): Tatiana Anikina. 2023. Towards Efficient Dialogue Processing in the Emergency Response Domain. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop), pages 212–225, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): Towards Efficient Dialogue Processing in the Emergency Response Domain (Anikina, ACL 2023)
PDF: https://aclanthology.org/2023.acl-srw.31.pdf