LC4EE: LLMs as Good Corrector for Event Extraction

Mengna Zhu, Kaisheng Zeng, Jibing Wu, Lihua Liu, Hongbin Huang, Lei Hou, Juanzi Li

Abstract
Event extraction (EE) is a critical task in natural language processing, yet deploying a practical EE system remains challenging. On one hand, powerful large language models (LLMs) currently perform poorly on EE, since the task is more complex than most other tasks. On the other hand, state-of-the-art (SOTA) small language models (SLMs) for EE are typically developed through fine-tuning, lack flexibility, and leave considerable room for improvement. We propose LLMs-as-Corrector for Event Extraction (LC4EE), an approach that leverages the superior extraction capability of SLMs and the instruction-following ability of LLMs to build a robust and highly available EE system. By using LLMs to identify and correct errors in SLM predictions based on automatically generated feedback, EE performance can be improved significantly. Experimental results on the representative ACE2005 and MAVEN-Arg datasets for Event Detection (ED) and EE validate the effectiveness of our method.
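As a rough sketch of the correct-with-feedback loop the abstract describes (the prediction fields, feedback rules, and prompt wording below are illustrative assumptions, not the paper's actual implementation):

    from dataclasses import dataclass
    from typing import Callable, Dict, List

    @dataclass
    class EventPrediction:
        # Hypothetical record of one SLM-predicted event.
        event_type: str
        trigger: str
        arguments: Dict[str, str]  # role -> text span

    def make_feedback(pred: EventPrediction, sentence: str) -> List[str]:
        # Illustrative rule-based checks; the paper's feedback generation is richer.
        notes = []
        if pred.trigger not in sentence:
            notes.append(f"Trigger '{pred.trigger}' is not a span of the sentence.")
        for role, span in pred.arguments.items():
            if span not in sentence:
                notes.append(f"Argument '{span}' (role {role}) is not a span of the sentence.")
        return notes

    def build_correction_prompt(sentence: str, pred: EventPrediction, feedback: List[str]) -> str:
        # Assemble an instruction asking the LLM to verify and correct the SLM output.
        return (
            "You are a corrector for event extraction.\n"
            f"Sentence: {sentence}\n"
            f"SLM prediction: type={pred.event_type}, trigger={pred.trigger}, arguments={pred.arguments}\n"
            "Feedback:\n- " + "\n- ".join(feedback or ["No issues detected."]) + "\n"
            "Return the corrected event record."
        )

    def correct(sentence: str, pred: EventPrediction, llm: Callable[[str], str]) -> str:
        # One correction step: SLM output plus auto-generated feedback goes to the LLM.
        return llm(build_correction_prompt(sentence, pred, make_feedback(pred, sentence)))

    if __name__ == "__main__":
        sent = "The company fired its CEO on Monday."
        # "the CEO" is not a literal span of the sentence, so the feedback will flag it.
        pred = EventPrediction("Personnel.End-Position", "fired",
                               {"Person": "the CEO", "Entity": "The company"})
        echo_llm = lambda p: "[corrected event would go here]\n" + p  # stand-in for a real LLM call
        print(correct(sent, pred, echo_llm))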
Anthology ID:
2024.findings-acl.715
Volume:
Findings of the Association for Computational Linguistics: ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
12028–12038
URL:
https://aclanthology.org/2024.findings-acl.715
DOI:
10.18653/v1/2024.findings-acl.715
Cite (ACL):
Mengna Zhu, Kaisheng Zeng, Jibing Wu, Lihua Liu, Hongbin Huang, Lei Hou, and Juanzi Li. 2024. LC4EE: LLMs as Good Corrector for Event Extraction. In Findings of the Association for Computational Linguistics: ACL 2024, pages 12028–12038, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
LC4EE: LLMs as Good Corrector for Event Extraction (Zhu et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.715.pdf