Okay, Let’s Do This! Modeling Event Coreference with Generated Rationales and Knowledge Distillation

Abhijnan Nath, Shadi Manafi Avari, Avyakta Chelle, Nikhil Krishnaswamy


Abstract
In NLP, Event Coreference Resolution (ECR) is the task of connecting event clusters that refer to the same underlying real-life event, usually via neural systems. In this work, we investigate using abductive free-text rationales (FTRs) generated by modern autoregressive LLMs as distant supervision of smaller student models for cross-document coreference (CDCR) of events. We implement novel rationale-oriented event clustering and knowledge distillation methods for event coreference scoring that leverage enriched information from the FTRs for improved CDCR without additional annotation or expensive document clustering. Our model using coreference-specific knowledge distillation achieves SOTA B³ F1 on the ECB+ and GVC corpora and we establish a new baseline on the AIDA Phase 1 corpus. Our code can be found at https://github.com/csu-signal/llama_cdcr.
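The abstract's pipeline (rationale generation, rationale-oriented clustering, and distillation into a student coreference scorer) is described in full in the paper; purely for orientation, below is a minimal PyTorch sketch of the general pattern of distilling a teacher's pairwise coreference scores into a small student scorer. Every name here (PairScorer, kd_loss, the alpha/temperature blend, the random stand-in tensors) is a hypothetical illustration of soft-label distillation, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PairScorer(nn.Module):
    """Toy student head: scores whether two event mentions corefer,
    given a concatenated pair representation from some encoder."""
    def __init__(self, hidden=64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 * hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, pair_repr):
        return self.mlp(pair_repr).squeeze(-1)  # raw coreference logit

def kd_loss(student_logits, teacher_logits, labels, alpha=0.5, T=2.0):
    """Blend supervised BCE on gold labels with a soft-label term that
    pulls the student toward temperature-scaled teacher probabilities.
    In a rationale-distillation setup, teacher_logits would be derived
    from an LLM prompted with the mention pair and its generated
    rationale (hypothetical; not the paper's exact objective)."""
    hard = F.binary_cross_entropy_with_logits(student_logits, labels)
    soft = F.binary_cross_entropy_with_logits(
        student_logits / T, torch.sigmoid(teacher_logits / T)
    )
    return alpha * hard + (1 - alpha) * soft

# Toy usage: random tensors stand in for encoder output and teacher scores.
torch.manual_seed(0)
student = PairScorer()
pairs = torch.randn(8, 128)              # 8 mention pairs, 2 x 64-dim reprs
labels = torch.randint(0, 2, (8,)).float()
teacher_logits = torch.randn(8)          # stand-in for LLM-derived scores
loss = kd_loss(student(pairs), teacher_logits, labels)
loss.backward()
print(f"distillation loss: {loss.item():.4f}")
```

The released repository linked in the abstract contains the authors' actual rationale generation, clustering, and distillation code; the sketch above only shows the generic teacher-to-student scoring pattern.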
Anthology ID: 2024.naacl-long.218
Volume: Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month: June
Year: 2024
Address: Mexico City, Mexico
Editors: Kevin Duh, Helena Gomez, Steven Bethard
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 3931–3946
URL: https://aclanthology.org/2024.naacl-long.218
Cite (ACL): Abhijnan Nath, Shadi Manafi Avari, Avyakta Chelle, and Nikhil Krishnaswamy. 2024. Okay, Let’s Do This! Modeling Event Coreference with Generated Rationales and Knowledge Distillation. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 3931–3946, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal): Okay, Let’s Do This! Modeling Event Coreference with Generated Rationales and Knowledge Distillation (Nath et al., NAACL 2024)
PDF: https://aclanthology.org/2024.naacl-long.218.pdf
Copyright: 2024.naacl-long.218.copyright.pdf