Once Upon a Time in Graph: Relative-Time Pretraining for Complex Temporal Reasoning

Sen Yang, Xin Li, Lidong Bing, Wai Lam


Abstract
Our physical world is constantly evolving over time, posing challenges for pre-trained language models to understand and reason over the temporal contexts of texts. Existing work focuses on strengthening the direct association between a piece of text and its timestamp. However, the knowledge-time association is usually insufficient for downstream tasks that require reasoning over temporal dependencies between pieces of knowledge. In this work, we make use of the underlying nature of time, namely that all temporally-scoped sentences are strung together along a one-dimensional time axis, and suggest creating a graph structure based on the relative placements of events along the time axis. Inspired by this graph view, we propose RemeMo (Relative Time Modeling), which explicitly connects all temporally-scoped facts by modeling the time relations between any two sentences. Experimental results show that RemeMo outperforms the baseline T5 on multiple temporal question answering datasets under various settings. Further analysis suggests that RemeMo is especially good at modeling long-range, complex temporal dependencies.
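To make the "graph view" in the abstract concrete, below is a minimal sketch (not the authors' implementation) of how temporally-scoped facts placed on a one-dimensional time axis could be connected pairwise by relative-time relations. The `Fact` class, the `relative_relation` labels, and the year-level time scopes are hypothetical illustrations, not details from the paper.

```python
# A hedged sketch of building a relative-time graph over temporally-scoped facts.
# Each fact carries an assumed (start, end) time scope; every pair of facts is
# connected by an edge labeled with their relative placement on the time axis.
from dataclasses import dataclass
from itertools import combinations


@dataclass
class Fact:
    text: str
    start: int  # start year of the fact's time scope (assumed granularity)
    end: int    # end year of the fact's time scope


def relative_relation(a: Fact, b: Fact) -> str:
    """Label the time relation between two facts on the time axis."""
    if a.end < b.start:
        return "before"
    if b.end < a.start:
        return "after"
    return "overlap"


def build_relative_time_graph(facts: list[Fact]) -> list[tuple[int, int, str]]:
    """Connect every pair of facts with a relative-time edge."""
    return [
        (i, j, relative_relation(facts[i], facts[j]))
        for i, j in combinations(range(len(facts)), 2)
    ]


if __name__ == "__main__":
    facts = [
        Fact("Event A took place.", 1990, 1995),
        Fact("Event B took place.", 1993, 2000),
        Fact("Event C took place.", 2005, 2007),
    ]
    for i, j, rel in build_relative_time_graph(facts):
        print(f"fact {i} is '{rel}' relative to fact {j}")
```

In the paper, such pairwise relation labels serve as a pretraining signal (predicting the time relation between two sentences) rather than an explicit graph data structure at inference time; the sketch only illustrates the relational view of the time axis.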
Anthology ID:
2023.emnlp-main.728
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
11879–11895
URL:
https://aclanthology.org/2023.emnlp-main.728
DOI:
10.18653/v1/2023.emnlp-main.728
Cite (ACL):
Sen Yang, Xin Li, Lidong Bing, and Wai Lam. 2023. Once Upon a Time in Graph: Relative-Time Pretraining for Complex Temporal Reasoning. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 11879–11895, Singapore. Association for Computational Linguistics.
Cite (Informal):
Once Upon a Time in Graph: Relative-Time Pretraining for Complex Temporal Reasoning (Yang et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.728.pdf
Video:
https://aclanthology.org/2023.emnlp-main.728.mp4