Attention Neural Model for Temporal Relation Extraction

Sijia Liu, Liwei Wang, Vipin Chaudhary, Hongfang Liu


Abstract
Neural network models have shown promise in the temporal relation extraction task. In this paper, we present an attention-based neural network model to extract containment relations within sentences from clinical narratives. The attention mechanism used on top of a GRU model outperforms existing state-of-the-art neural network models on the THYME corpus in intra-sentence temporal relation extraction.
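The core component the abstract describes, an attention layer pooling GRU hidden states into a single representation for relation classification, can be sketched as below. This is an illustrative pure-Python reimplementation, not the authors' code: the dot-product scoring function, the learned query vector, and the toy dimensions are all assumptions.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(hidden_states, query):
    """Score each GRU hidden state against a (hypothetical) learned query
    vector, normalize the scores with softmax, and return the weighted sum,
    i.e. the context vector fed to the relation classifier."""
    scores = [sum(h_i * q_i for h_i, q_i in zip(h, query))
              for h in hidden_states]
    weights = softmax(scores)
    dim = len(hidden_states[0])
    return [sum(w * h[d] for w, h in zip(weights, hidden_states))
            for d in range(dim)]

# Toy example: three 2-dim hidden states; the query favors the second state,
# so the pooled vector leans toward its direction.
H = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
ctx = attention_pool(H, query=[0.0, 2.0])
```

In practice the hidden states would come from a (bi-directional) GRU over the sentence tokens and the attention weights would be learned end-to-end; this sketch only shows the pooling arithmetic.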
Anthology ID:
W19-1917
Volume:
Proceedings of the 2nd Clinical Natural Language Processing Workshop
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota, USA
Editors:
Anna Rumshisky, Kirk Roberts, Steven Bethard, Tristan Naumann
Venue:
ClinicalNLP
Publisher:
Association for Computational Linguistics
Pages:
134–139
URL:
https://aclanthology.org/W19-1917
DOI:
10.18653/v1/W19-1917
Cite (ACL):
Sijia Liu, Liwei Wang, Vipin Chaudhary, and Hongfang Liu. 2019. Attention Neural Model for Temporal Relation Extraction. In Proceedings of the 2nd Clinical Natural Language Processing Workshop, pages 134–139, Minneapolis, Minnesota, USA. Association for Computational Linguistics.
Cite (Informal):
Attention Neural Model for Temporal Relation Extraction (Liu et al., ClinicalNLP 2019)
PDF:
https://aclanthology.org/W19-1917.pdf