Understand before Answer: Improve Temporal Reading Comprehension via Precise Question Understanding

Hao Huang, Xiubo Geng, Guodong Long, Daxin Jiang


Abstract
This work studies temporal reading comprehension (TRC), the task of reading a free-text passage and answering temporal ordering questions. Precise question understanding is critical for TRC. For example, the questions “What happened before the victory?” and “What happened after the victory?” share all but one word, yet their answers are entirely different. Moreover, even when two questions ask about similar temporal relations, subtle variations in wording can lead to different answers. For example, although both “What usually happened during the press release?” and “What might happen during the press release?” ask about events that happen during “the press release”, they convey divergent semantics. To this end, we propose a novel reading comprehension approach with precise question understanding. Specifically, a temporal ordering question is embedded into two vectors that capture the referred event and the temporal relation, respectively; we then evaluate the temporal relation between each candidate event and the referred event based on these vectors. Such fine-grained representations offer two benefits. First, they enable a better understanding of the question by attending to its different elements. Second, they provide good interpretability when evaluating temporal relations. Furthermore, we harness an auxiliary contrastive loss for representation learning of temporal relations, which helps distinguish relations that differ in subtle but critical ways. The proposed approach outperforms strong baselines and achieves state-of-the-art performance on the TORQUE dataset. It also improves the accuracy of four pre-trained language models (BERT base, BERT large, RoBERTa base, and RoBERTa large), demonstrating its effectiveness across diverse models.
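The abstract describes the approach only in prose; the sketch below illustrates one plausible reading of it in PyTorch: a pooled question representation is projected into two vectors (referred event and temporal relation), candidate events are scored against them, and a supervised contrastive loss shapes the relation space. All names here (DualQuestionScorer, event_head, relation_head, contrastive_relation_loss) and the additive fusion plus bilinear scoring are illustrative assumptions, not the authors' actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualQuestionScorer(nn.Module):
    """Hypothetical sketch of the two-vector question encoding and
    relation scoring described in the abstract."""

    def __init__(self, hidden_size: int = 768):
        super().__init__()
        # Two heads that project the pooled question encoding into
        # (a) a referred-event vector and (b) a temporal-relation vector.
        self.event_head = nn.Linear(hidden_size, hidden_size)
        self.relation_head = nn.Linear(hidden_size, hidden_size)
        # Bilinear scorer between fused candidates and the relation vector.
        self.bilinear = nn.Bilinear(hidden_size, hidden_size, 1)

    def forward(self, question_cls, candidate_events):
        # question_cls: (batch, hidden) pooled question representation
        # candidate_events: (batch, num_events, hidden) event representations
        ref_event = self.event_head(question_cls)    # referred-event vector
        relation = self.relation_head(question_cls)  # temporal-relation vector
        # Condition each candidate on the referred event (additive fusion
        # is an assumption), then score it against the relation vector.
        query = ref_event.unsqueeze(1) + candidate_events
        rel = relation.unsqueeze(1).expand_as(query).contiguous()
        return self.bilinear(query, rel).squeeze(-1)  # (batch, num_events)

def contrastive_relation_loss(rel_vecs, labels, temperature: float = 0.1):
    """Supervised contrastive loss over relation vectors: questions with
    the same temporal relation are pulled together; subtly different ones
    (e.g. 'before' vs. 'after') are pushed apart. Hypothetical sketch."""
    rel_vecs = F.normalize(rel_vecs, dim=-1)
    sim = rel_vecs @ rel_vecs.t() / temperature            # (batch, batch)
    eye = torch.eye(sim.size(0), dtype=torch.bool, device=sim.device)
    pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)) & ~eye
    # Log-softmax over all non-self pairs for each anchor.
    log_prob = sim - torch.logsumexp(
        sim.masked_fill(eye, float("-inf")), dim=1, keepdim=True)
    pos_per_anchor = pos_mask.float().sum(1).clamp(min=1)
    return -(pos_mask.float() * log_prob).sum(1).div(pos_per_anchor).mean()

if __name__ == "__main__":
    scorer = DualQuestionScorer()
    logits = scorer(torch.randn(2, 768), torch.randn(2, 5, 768))
    loss = contrastive_relation_loss(torch.randn(8, 768), torch.randint(0, 3, (8,)))
    print(logits.shape, loss.item())  # torch.Size([2, 5]) and a scalar
```

In this reading, the relation vector acts as a learned query over candidate events, and the contrastive term directly targets the paper's motivating failure case: question pairs that differ by a single temporal word.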
Anthology ID:
2022.naacl-main.28
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
375–384
URL:
https://aclanthology.org/2022.naacl-main.28
DOI:
10.18653/v1/2022.naacl-main.28
Cite (ACL):
Hao Huang, Xiubo Geng, Guodong Long, and Daxin Jiang. 2022. Understand before Answer: Improve Temporal Reading Comprehension via Precise Question Understanding. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 375–384, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Understand before Answer: Improve Temporal Reading Comprehension via Precise Question Understanding (Huang et al., NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-main.28.pdf
Video:
https://aclanthology.org/2022.naacl-main.28.mp4
Data
DROP, TORQUE