Shudong Lu
2024
TRUE-UIE: Two Universal Relations Unify Information Extraction Tasks
Yucheng Wang | Bowen Yu | Yilin Liu | Shudong Lu
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Information extraction (IE) encounters challenges due to the variety of schemas and objectives that differ across tasks. Recent advancements hint at the potential of universal approaches to model such tasks, referred to as Universal Information Extraction (UIE). Although these approaches handle diverse tasks in one model, their generalization is limited because they still learn task-specific knowledge. In this study, we introduce an innovative paradigm, TRUE-UIE, in which all IE tasks are aligned to the same learning goals: extracting mention spans and two universal relations named NEXT and IS. During decoding, the NEXT relation groups related elements, while the IS relation, in conjunction with structured language prompts, performs type recognition. Additionally, we consider the sequential dependency of tokens during span extraction, an aspect often overlooked in prevalent models. Our empirical experiments indicate that TRUE-UIE achieves state-of-the-art performance on established benchmarks encompassing 16 datasets spanning 7 diverse IE tasks. Further evaluations reveal that our approach effectively shares knowledge between different IE tasks, showing significant transferability in zero-shot and few-shot scenarios.
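As a rough illustration of the decoding idea described in this abstract, the sketch below groups extracted spans via NEXT links and assigns each span a type via IS links. Everything here is hypothetical: the example spans, relation pairs, and type names are made up, and this is not the authors' implementation, only a toy picture of how two universal relations could be turned into typed records.

```python
from collections import defaultdict

# Made-up example inputs: extracted mention spans, NEXT links that group
# elements of the same record, and IS links that attach type names taken
# from structured language prompts. None of these values come from the paper.
spans = ["Barack Obama", "born in", "Hawaii"]
next_pairs = [("Barack Obama", "born in"), ("born in", "Hawaii")]
is_links = {
    "Barack Obama": "person",
    "born in": "birthplace relation",
    "Hawaii": "location",
}

# Group spans into records by treating NEXT links as undirected edges and
# taking connected components.
adj = defaultdict(set)
for a, b in next_pairs:
    adj[a].add(b)
    adj[b].add(a)

seen, records = set(), []
for span in spans:
    if span in seen:
        continue
    stack, group = [span], []
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        group.append(node)
        stack.extend(adj[node])
    # Each grouped span is paired with the type recognized via the IS relation.
    records.append([(m, is_links.get(m)) for m in group])

print(records)  # one record containing all three spans with their types
```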
2023
DemoSG: Demonstration-enhanced Schema-guided Generation for Low-resource Event Extraction
Gang Zhao | Xiaocheng Gong | Xinjie Yang | Guanting Dong | Shudong Lu | Si Li
Findings of the Association for Computational Linguistics: EMNLP 2023
Most current Event Extraction (EE) methods focus on the high-resource scenario, which requires a large amount of annotated data and can hardly be applied to low-resource domains. To address EE more effectively with limited resources, we propose the Demonstration-enhanced Schema-guided Generation (DemoSG) model, which benefits low-resource EE in two ways. First, we propose a demonstration-based learning paradigm for EE that makes full use of the annotated data by transforming it into demonstrations that illustrate the extraction process and help the model learn effectively. Second, we formulate EE as a natural language generation task guided by schema-based prompts, thereby leveraging label semantics and promoting knowledge transfer in low-resource scenarios. We conduct extensive experiments under in-domain and domain-adaptation low-resource settings on three datasets and study the robustness of DemoSG. The results show that DemoSG significantly outperforms current methods in low-resource scenarios.
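The sketch below is a hypothetical illustration of how a demonstration and a schema-based prompt might be combined into a single input for a generative EE model. The template wording, the schema, the function name build_prompt, and the example sentences are assumptions made for illustration, not details taken from the paper.

```python
# Hypothetical sketch: assemble a demonstration-enhanced, schema-guided input
# for a generative event extraction model. All names and templates are made up.

def build_prompt(schema: dict, demonstration: tuple, sentence: str) -> str:
    """Compose a prompt showing the schema, one worked demonstration,
    and the new sentence to extract from."""
    event_type, roles = schema["event_type"], schema["roles"]
    demo_sentence, demo_record = demonstration

    schema_part = f"Event type: {event_type}. Roles: {', '.join(roles)}."
    demo_part = f"Example: {demo_sentence} => {demo_record}"
    query_part = f"Extract: {sentence} =>"
    return "\n".join([schema_part, demo_part, query_part])

schema = {"event_type": "Attack", "roles": ["Attacker", "Target", "Place"]}
demonstration = (
    "Rebels shelled the village on Tuesday.",
    "Attacker: Rebels; Target: the village; Place: <none>",
)
print(build_prompt(schema, demonstration, "Troops raided the compound at dawn."))
```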