DemoSG: Demonstration-enhanced Schema-guided Generation for Low-resource Event Extraction

Gang Zhao, Xiaocheng Gong, Xinjie Yang, Guanting Dong, Shudong Lu, Si Li


Abstract
Most current Event Extraction (EE) methods focus on high-resource scenarios, which require large amounts of annotated data and can hardly be applied to low-resource domains. To address EE more effectively with limited resources, we propose the Demonstration-enhanced Schema-guided Generation (DemoSG) model, which benefits low-resource EE in two ways. First, we propose a demonstration-based learning paradigm for EE that makes full use of the annotated data by transforming it into demonstrations that illustrate the extraction process and help the model learn effectively. Second, we formulate EE as a natural language generation task guided by schema-based prompts, thereby leveraging label semantics and promoting knowledge transfer in low-resource scenarios. We conduct extensive experiments under in-domain and domain-adaptation low-resource settings on three datasets and study the robustness of DemoSG. The results show that DemoSG significantly outperforms current methods in low-resource scenarios.
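To make the two ideas in the abstract more concrete, the following is a minimal, hypothetical Python sketch of how a schema-guided prompt augmented with one demonstration might be assembled for a sequence-to-sequence extractor. The EventSchema and Demonstration data classes, the format_demo and build_prompt helpers, and the prompt wording are illustrative assumptions, not DemoSG's actual templates or released code.

```python
# Hypothetical sketch of demonstration-enhanced, schema-guided prompt construction.
# Names, templates, and data structures are assumptions; DemoSG's real format may differ.

from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class EventSchema:
    event_type: str
    roles: List[str]              # argument roles defined by the event schema


@dataclass
class Demonstration:
    sentence: str
    trigger: str
    arguments: List[Tuple[str, str]]   # (role, text span) pairs from an annotated example


def format_demo(demo: Demonstration) -> str:
    """Render one annotated example as an in-context demonstration."""
    args = "; ".join(f"{role}: {span}" for role, span in demo.arguments)
    return f"Sentence: {demo.sentence}\nTrigger: {demo.trigger}\nArguments: {args}"


def build_prompt(schema: EventSchema, demo: Demonstration, sentence: str) -> str:
    """Combine schema-based guidance, one demonstration, and the input sentence
    into a single prompt for a generative model (e.g., a BART/T5-style encoder-decoder)."""
    schema_part = f"Event type: {schema.event_type}. Roles: {', '.join(schema.roles)}."
    return (
        f"{schema_part}\n"
        f"Example:\n{format_demo(demo)}\n"
        f"Now extract the event from:\nSentence: {sentence}\n"
        f"Trigger and arguments:"
    )


if __name__ == "__main__":
    schema = EventSchema("Conflict.Attack", ["Attacker", "Target", "Instrument", "Place"])
    demo = Demonstration(
        sentence="Rebels attacked the convoy with rockets near the border.",
        trigger="attacked",
        arguments=[("Attacker", "Rebels"), ("Target", "the convoy"),
                   ("Instrument", "rockets"), ("Place", "near the border")],
    )
    print(build_prompt(schema, demo, "Protesters stormed the embassy on Friday."))
```

In this sketch the schema supplies event-type and role names so the generator can exploit label semantics, while the demonstration shows the expected output format, mirroring the two components described in the abstract.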
Anthology ID:
2023.findings-emnlp.121
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1805–1816
URL:
https://aclanthology.org/2023.findings-emnlp.121
DOI:
10.18653/v1/2023.findings-emnlp.121
Cite (ACL):
Gang Zhao, Xiaocheng Gong, Xinjie Yang, Guanting Dong, Shudong Lu, and Si Li. 2023. DemoSG: Demonstration-enhanced Schema-guided Generation for Low-resource Event Extraction. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 1805–1816, Singapore. Association for Computational Linguistics.
Cite (Informal):
DemoSG: Demonstration-enhanced Schema-guided Generation for Low-resource Event Extraction (Zhao et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.121.pdf