DS-TOD: Efficient Domain Specialization for Task-Oriented Dialog

Chia-Chien Hung, Anne Lauscher, Simone Ponzetto, Goran Glavaš


Abstract
Recent work has shown that self-supervised dialog-specific pretraining on large conversational datasets yields substantial gains over traditional language modeling (LM) pretraining in downstream task-oriented dialog (TOD). These approaches, however, exploit general dialogic corpora (e.g., Reddit) and thus presumably fail to reliably embed domain-specific knowledge useful for concrete downstream TOD domains. In this work, we investigate the effects of domain specialization of pretrained language models (PLMs) for TOD. Within our DS-TOD framework, we first automatically extract salient domain-specific terms, and then use them to construct DomainCC and DomainReddit – resources that we leverage for domain-specific pretraining, based on (i) masked language modeling (MLM) and (ii) response selection (RS) objectives, respectively. We further propose a resource-efficient and modular domain specialization by means of domain adapters – additional parameter-light layers in which we encode the domain knowledge. Our experiments with prominent TOD tasks – dialog state tracking (DST) and response retrieval (RR) – encompassing five domains from the MultiWOZ benchmark demonstrate the effectiveness of DS-TOD. Moreover, we show that the light-weight adapter-based specialization (1) performs comparably to full fine-tuning in single domain setups and (2) is particularly suitable for multi-domain specialization, where besides advantageous computational footprint, it can offer better TOD performance.
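The domain adapters mentioned in the abstract are parameter-light bottleneck layers inserted into a frozen pretrained model; only the adapter weights are trained on the domain corpus. Below is a minimal NumPy sketch of one such bottleneck adapter forward pass (down-projection, nonlinearity, up-projection, residual add). The dimensions, GELU approximation, and initialization here are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def adapter(hidden, W_down, b_down, W_up, b_up):
    """Bottleneck adapter forward pass: down-project to a small
    bottleneck, apply GELU, up-project back, and add a residual
    connection so the frozen PLM's representation is preserved.
    Only these small matrices would be trained per domain."""
    h = hidden @ W_down + b_down
    # tanh approximation of GELU
    h = 0.5 * h * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (h + 0.044715 * h**3)))
    return hidden + (h @ W_up + b_up)

rng = np.random.default_rng(0)
d, m = 768, 48  # hidden size and bottleneck size (m << d keeps it parameter-light)
W_down = rng.standard_normal((d, m)) * 0.02
W_up = rng.standard_normal((m, d)) * 0.02
x = rng.standard_normal((16, d))  # hidden states for one 16-token sequence
out = adapter(x, W_down, np.zeros(m), W_up, np.zeros(d))
print(out.shape)  # (16, 768)
```

Because the adapter is the only trainable component, a separate adapter can be stored per domain (roughly 2·d·m parameters each here) and swapped in at inference time, which is what makes the multi-domain setup in the paper modular and cheap.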
Anthology ID:
2022.findings-acl.72
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
891–904
URL:
https://aclanthology.org/2022.findings-acl.72
DOI:
10.18653/v1/2022.findings-acl.72
Cite (ACL):
Chia-Chien Hung, Anne Lauscher, Simone Ponzetto, and Goran Glavaš. 2022. DS-TOD: Efficient Domain Specialization for Task-Oriented Dialog. In Findings of the Association for Computational Linguistics: ACL 2022, pages 891–904, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
DS-TOD: Efficient Domain Specialization for Task-Oriented Dialog (Hung et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.72.pdf
Software:
 2022.findings-acl.72.software.zip
Video:
 https://aclanthology.org/2022.findings-acl.72.mp4
Code:
 umanlp/ds-tod
Data:
CCNet