AISFG: Abundant Information Slot Filling Generator

Yang Yan, Junda Ye, Zhongbao Zhang, Liwen Wang


Abstract
As an essential component of task-oriented dialogue systems, slot filling requires enormous amounts of labeled training data in a given domain. However, in most cases, little or no target-domain training data is available at training time. Thus, cross-domain slot filling has to cope with this data scarcity problem through zero/few-shot learning. Previous research on zero/few-shot cross-domain slot filling focuses on slot descriptions and examples while ignoring the issues of slot type ambiguity and example ambiguity. To address these problems, we propose the Abundant Information Slot Filling Generator (AISFG), a generative model with a novel query template that incorporates domain descriptions, slot descriptions, and examples with context. Experimental results show that our model outperforms state-of-the-art approaches on the zero/few-shot slot filling task.
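To make the query-template idea concrete, here is a minimal, hypothetical sketch of how such a template might be assembled: the domain description, slot description, and an in-context example are concatenated with the input utterance into a single query for a seq2seq generator. The exact template wording and field names used in AISFG may differ; everything below is illustrative.

```python
# Hypothetical sketch of a query template for generative slot filling.
# The exact template format in AISFG may differ; field names are illustrative.

def build_query(domain_desc: str, slot: str, slot_desc: str,
                example: str, utterance: str) -> str:
    """Combine domain description, slot description, and an in-context
    example with the input utterance into a single generation query."""
    return (f"domain: {domain_desc} ; "
            f"slot: {slot} ({slot_desc}) ; "
            f"example: {example} ; "
            f"utterance: {utterance}")

query = build_query(
    domain_desc="book a restaurant table",
    slot="time",
    slot_desc="the time of the reservation",
    example="reserve a table at seven pm -> seven pm",
    utterance="book a table for two at eight pm tonight",
)
# A pretrained seq2seq model (e.g. T5 or BART) conditioned on this query
# would then be trained to generate the slot value for the utterance.
```

Feeding all three information sources into one query is what lets the generator disambiguate slot types that share surface forms across domains.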
Anthology ID:
2022.naacl-main.308
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
4180–4187
URL:
https://aclanthology.org/2022.naacl-main.308
DOI:
10.18653/v1/2022.naacl-main.308
Cite (ACL):
Yang Yan, Junda Ye, Zhongbao Zhang, and Liwen Wang. 2022. AISFG: Abundant Information Slot Filling Generator. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 4180–4187, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
AISFG: Abundant Information Slot Filling Generator (Yan et al., NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-main.308.pdf