Dynamic Global Memory for Document-level Argument Extraction

Xinya Du, Sha Li, Heng Ji


Abstract
Extracting informative arguments of events from news articles is a challenging problem in information extraction, which requires a global contextual understanding of each document. While recent work on document-level extraction has moved beyond single sentences and improved the cross-sentence inference capability of end-to-end models, these models are still restricted by input sequence length constraints and usually ignore the global context shared between events. To tackle this issue, we introduce a new global neural generation-based framework for document-level event argument extraction: we construct a document memory store to record contextual event information and leverage it, both implicitly and explicitly, to guide the decoding of arguments for later events. Empirical results show that our framework substantially outperforms prior methods and, with our constrained decoding design, is more robust to adversarially annotated examples.
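The core idea of the memory store can be illustrated with a minimal sketch. This is not the authors' implementation; the class name, method names, and the role-consistency check are hypothetical, intended only to show how arguments decoded for earlier events in a document might be recorded and consulted when decoding later events.

```python
# Illustrative sketch (not the paper's code): a document-level memory
# that records role -> argument fillers of previously decoded events,
# so later events in the same document can consult them.
from collections import defaultdict

class DocumentMemory:
    """Records decoded events and a global role->arguments view."""

    def __init__(self):
        self.events = []                      # list of (event_type, {role: arg})
        self.args_by_role = defaultdict(set)  # role -> set of argument strings

    def record(self, event_type, role_args):
        """Store one decoded event's arguments in the memory."""
        self.events.append((event_type, dict(role_args)))
        for role, arg in role_args.items():
            self.args_by_role[role].add(arg)

    def roles_of(self, arg):
        """Roles this argument filled in earlier events; a constrained
        decoder could use this to keep assignments consistent."""
        return {r for r, args in self.args_by_role.items() if arg in args}

# Usage: when decoding a second event, the memory tells us which roles
# a candidate argument already filled earlier in the document.
mem = DocumentMemory()
mem.record("Attack", {"Attacker": "John Doe", "Place": "Baghdad"})
print(mem.roles_of("John Doe"))  # prints {'Attacker'}
```

A real system would integrate such a lookup into the generation model's decoding loop (e.g., to rescore or prune candidate arguments), rather than as a standalone data structure.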
Anthology ID:
2022.acl-long.361
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
5264–5275
URL:
https://aclanthology.org/2022.acl-long.361
DOI:
10.18653/v1/2022.acl-long.361
Cite (ACL):
Xinya Du, Sha Li, and Heng Ji. 2022. Dynamic Global Memory for Document-level Argument Extraction. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 5264–5275, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Dynamic Global Memory for Document-level Argument Extraction (Du et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.361.pdf
Software:
2022.acl-long.361.software.zip
Code:
xinyadu/memory_docie