Complex Event Schema Induction with Knowledge-Enriched Diffusion Model

Yupu Hao, Pengfei Cao, Yubo Chen, Kang Liu, Jiexin Xu, Huaijun Li, Xiaojian Jiang, Jun Zhao


Abstract
A complex event schema is a graph structure that represents real-world knowledge of events and the multi-dimensional relationships among them. However, previous studies on event schema induction have been hindered by challenges such as error propagation and data quality issues. To tackle these challenges, we propose a knowledge-enriched discrete diffusion model. Specifically, we distill the abundant event scenario knowledge of Large Language Models (LLMs) through an object-oriented, Python-style prompt. We incorporate this knowledge into the training data, enhancing its quality. Subsequently, we employ a discrete diffusion process that generates all nodes and links simultaneously in a non-autoregressive manner, thereby avoiding error propagation. Additionally, we devise an entity relationship prediction module to complete entity relationships between event arguments. Experimental results demonstrate that our approach achieves outstanding performance across a range of evaluation metrics.
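The abstract mentions distilling event scenario knowledge from LLMs via an "object-oriented Python style prompt." Below is a minimal, hypothetical sketch of what such a prompt might look like; the class names, argument roles, and `before` field are illustrative assumptions, not the authors' actual prompt format.

```python
# Hypothetical sketch of an object-oriented, Python-style prompt for
# eliciting event schema knowledge from an LLM. All names here are
# illustrative assumptions, not the paper's actual prompt design.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Event:
    name: str                                            # event type, e.g. "Attack"
    arguments: List[str] = field(default_factory=list)   # argument roles
    before: List[str] = field(default_factory=list)      # temporal predecessors


def render_prompt(scenario: str, seed_events: List[Event]) -> str:
    """Render seed events as Python class definitions, asking the LLM
    to continue the schema in the same code style."""
    lines = [f"# Scenario: {scenario}",
             "# Complete the event schema below with further event classes."]
    for ev in seed_events:
        lines.append(f"class {ev.name}(Event):")
        lines.append(f"    arguments = {ev.arguments!r}")
        lines.append(f"    before = {ev.before!r}")
        lines.append("")
    return "\n".join(lines)


prompt = render_prompt(
    "car bombing",
    [Event("Attack", ["Attacker", "Target", "Instrument"]),
     Event("Injure", ["Victim", "Injurer"], before=["Attack"])],
)
print(prompt)
```

Rendering schema fragments as code (rather than free text) gives the LLM an unambiguous, parseable target format, which is the general motivation for code-style prompting.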
Anthology ID:
2023.findings-emnlp.319
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4809–4825
URL:
https://aclanthology.org/2023.findings-emnlp.319
DOI:
10.18653/v1/2023.findings-emnlp.319
Cite (ACL):
Yupu Hao, Pengfei Cao, Yubo Chen, Kang Liu, Jiexin Xu, Huaijun Li, Xiaojian Jiang, and Jun Zhao. 2023. Complex Event Schema Induction with Knowledge-Enriched Diffusion Model. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 4809–4825, Singapore. Association for Computational Linguistics.
Cite (Informal):
Complex Event Schema Induction with Knowledge-Enriched Diffusion Model (Hao et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.319.pdf