Jonathon Cai


2020

Exploring Contextualized Neural Language Models for Temporal Dependency Parsing
Hayley Ross | Jonathon Cai | Bonan Min
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)

Extracting temporal relations between events and time expressions has many applications, such as constructing event timelines and time-related question answering. It is a challenging problem that requires syntactic and semantic information at the sentence or discourse level, which may be captured by deep contextualized language models (LMs) such as BERT (Devlin et al., 2019). In this paper, we develop several variants of a BERT-based temporal dependency parser and show that BERT significantly improves temporal dependency parsing (Zhang and Xue, 2018a). We also present a detailed analysis of why deep contextualized neural LMs help and where they may fall short. Source code and resources are made available at https://github.com/bnmin/tdp_ranking.
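
As a rough illustration of the approach the abstract describes (not the authors' implementation; see the linked repository for the real code), the sketch below uses Hugging Face transformers to pool BERT contextualized embeddings over event and time-expression mentions and score a candidate temporal attachment. The model choice, the character-offset spans, the mean-pooling, and the cosine-similarity scorer are all illustrative assumptions; the paper's parser learns a trained ranking function instead.

```python
# Minimal sketch (assumptions labeled): BERT span representations
# for ranking a candidate parent in temporal dependency parsing.
import torch
from transformers import BertModel, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

sentence = "After the meeting on Tuesday, she flew to Boston."

enc = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    hidden = model(**enc).last_hidden_state[0]  # (num_tokens, 768)

def span_embedding(char_span):
    """Mean-pool BERT token vectors whose characters fall inside char_span."""
    start, end = char_span
    idxs = [i for i in range(hidden.size(0))
            if enc.token_to_chars(0, i) is not None
            and enc.token_to_chars(0, i).start >= start
            and enc.token_to_chars(0, i).end <= end]
    return hidden[idxs].mean(dim=0)

# Hypothetical mention spans (character offsets) for one event,
# one time expression, and a candidate parent.
meeting = span_embedding((10, 17))   # "meeting"
tuesday = span_embedding((21, 28))   # "Tuesday"
flew    = span_embedding((34, 38))   # "flew"

# Toy attachment score: cosine similarity between the child event and a
# candidate parent (the actual scorer is a trained neural ranking model).
score = torch.cosine_similarity(flew.unsqueeze(0), tuesday.unsqueeze(0))
print(f"score(flew -> Tuesday) = {score.item():.3f}")
```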