Neural Language Modeling for Contextualized Temporal Graph Generation

Aman Madaan, Yiming Yang


Abstract
This paper presents the first study on using large-scale pre-trained language models for automated generation of an event-level temporal graph for a document. Despite the huge success of neural pre-training methods in NLP tasks, their potential for temporal reasoning over event graphs has not been sufficiently explored. Part of the reason is the difficulty in obtaining large training corpora with human-annotated events and temporal links. We address this challenge by using existing IE/NLP tools to automatically generate a large quantity (89,000) of system-produced document-graph pairs, and propose a novel formulation of the contextualized graph generation problem as a sequence-to-sequence mapping task. These strategies enable us to leverage and fine-tune pre-trained language models on the system-induced training data for the graph generation task. Our experiments show that our approach is highly effective in generating structurally and semantically valid graphs. Further, evaluation on a challenging hand-labeled, out-of-domain corpus shows that our method outperforms the closest existing method by a large margin on several metrics. We also show a downstream application of our approach by adapting it to answer open-ended temporal questions in a reading comprehension setting.
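The abstract casts contextualized temporal graph generation as a sequence-to-sequence mapping over system-produced document-graph pairs, solved by fine-tuning a pre-trained language model. The sketch below is a minimal illustration of that framing, not the authors' released code: it fine-tunes an off-the-shelf causal LM on a single hypothetical document paired with a DOT-style serialization of its temporal graph. The serialization format, the <GRAPH> separator token, and the example data are assumptions made for illustration.

```python
# Minimal sketch (illustrative only): document-to-temporal-graph generation as
# sequence-to-sequence mapping via causal-LM fine-tuning. The target graph is
# serialized as text so the task reduces to next-token prediction over
# "document <GRAPH> graph". The example pair and separator are hypothetical.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast, Trainer, TrainingArguments

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
model = GPT2LMHeadModel.from_pretrained("gpt2")

# One hypothetical document-graph training pair (a real setup would use the
# tens of thousands of system-produced pairs described in the abstract).
document = "The storm hit the coast before residents evacuated."
graph = 'digraph {{ "hit" -> "evacuated" [label="before"]; }}'
text = document + " <GRAPH> " + graph + tokenizer.eos_token

enc = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
labels = enc["input_ids"].clone()


class PairDataset(torch.utils.data.Dataset):
    """Wraps tokenized (document, serialized-graph) sequences for the Trainer."""

    def __init__(self, encodings, labels):
        self.encodings, self.labels = encodings, labels

    def __len__(self):
        return self.labels.size(0)

    def __getitem__(self, i):
        return {
            "input_ids": self.encodings["input_ids"][i],
            "attention_mask": self.encodings["attention_mask"][i],
            "labels": self.labels[i],
        }


trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="tmp_graph_gen",
        num_train_epochs=1,
        per_device_train_batch_size=1,
    ),
    train_dataset=PairDataset(enc, labels),
)
trainer.train()

# At inference time, the fine-tuned model is prompted with a new document plus
# the separator and asked to continue with the serialized temporal graph.
prompt = tokenizer("A new document to parse. <GRAPH> ", return_tensors="pt")
out = model.generate(**prompt, max_new_tokens=64, pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(out[0]))
```

In this framing, the model never sees an explicit graph data structure; generating a structurally valid serialization is itself part of what fine-tuning must learn, which is why the paper evaluates structural as well as semantic validity.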
Anthology ID:
2021.naacl-main.67
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
864–881
URL:
https://aclanthology.org/2021.naacl-main.67
DOI:
10.18653/v1/2021.naacl-main.67
Cite (ACL):
Aman Madaan and Yiming Yang. 2021. Neural Language Modeling for Contextualized Temporal Graph Generation. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 864–881, Online. Association for Computational Linguistics.
Cite (Informal):
Neural Language Modeling for Contextualized Temporal Graph Generation (Madaan & Yang, NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-main.67.pdf
Video:
https://aclanthology.org/2021.naacl-main.67.mp4
Code:
madaan/temporal-graph-gen