StructSP: Efficient Fine-tuning of Task-Oriented Dialog System by Using Structure-aware Boosting and Grammar Constraints

Truong Do, Phuong Nguyen, Minh Nguyen


Abstract
We have investigated methods that utilize hierarchical structure information in the semantic parsing task and have devised a method that reinforces the semantic awareness of a pre-trained language model via a two-step fine-tuning mechanism: hierarchical structure information strengthening followed by task-specific fine-tuning. The resulting model learns the contextual representations of utterances embedded within their hierarchical semantic structure better than existing models and thereby improves system performance. In addition, we created a mechanism that uses inductive grammar to dynamically prune unpromising directions during semantic structure parsing. Finally, through experiments on the TOP and TOPv2 (low-resource setting) datasets, we achieved state-of-the-art (SOTA) performance, confirming the effectiveness of our proposed model. (Our code will be published when this paper is accepted.)
Anthology ID:
2023.findings-acl.648
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
10206–10220
URL:
https://aclanthology.org/2023.findings-acl.648
DOI:
10.18653/v1/2023.findings-acl.648
Cite (ACL):
Truong Do, Phuong Nguyen, and Minh Nguyen. 2023. StructSP: Efficient Fine-tuning of Task-Oriented Dialog System by Using Structure-aware Boosting and Grammar Constraints. In Findings of the Association for Computational Linguistics: ACL 2023, pages 10206–10220, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
StructSP: Efficient Fine-tuning of Task-Oriented Dialog System by Using Structure-aware Boosting and Grammar Constraints (Do et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.648.pdf
Video:
https://aclanthology.org/2023.findings-acl.648.mp4