Zero-Shot On-the-Fly Event Schema Induction

Rotem Dror, Haoyu Wang, Dan Roth


Abstract
What are the events involved in a pandemic outbreak? What steps should be taken when planning a wedding? The answers to these questions can be found by collecting many documents on the complex event of interest, extracting relevant information, and analyzing it. We present a new approach in which large language models are utilized to generate source documents that allow predicting, given a high-level event definition, the specific events, arguments, and relations between them to construct a schema that describes the complex event in its entirety. Using our model, complete schemas on any topic can be generated on-the-fly without any manual data collection, i.e., in a zero-shot manner. Moreover, we develop efficient methods to extract pertinent information from texts and demonstrate in a series of experiments that these schemas are considered more complete than human-curated ones in the majority of examined scenarios. Finally, we show that this framework is comparable in performance to previous supervised schema induction methods that rely on collecting real texts, and even achieves the best score on the prediction task.
Anthology ID:
2023.findings-eacl.53
Volume:
Findings of the Association for Computational Linguistics: EACL 2023
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
705–725
URL:
https://aclanthology.org/2023.findings-eacl.53
DOI:
10.18653/v1/2023.findings-eacl.53
Cite (ACL):
Rotem Dror, Haoyu Wang, and Dan Roth. 2023. Zero-Shot On-the-Fly Event Schema Induction. In Findings of the Association for Computational Linguistics: EACL 2023, pages 705–725, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Zero-Shot On-the-Fly Event Schema Induction (Dror et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-eacl.53.pdf
Video:
https://aclanthology.org/2023.findings-eacl.53.mp4