Inferring Implicit Relations in Complex Questions with Language Models

Uri Katz, Mor Geva, Jonathan Berant


Abstract
A prominent challenge for modern language understanding systems is the ability to answer implicit reasoning questions, where the reasoning steps required to answer the question are not stated explicitly in the text. In this work, we investigate why current models struggle with implicit reasoning question answering (QA) tasks, by decoupling the inference of reasoning steps from their execution. We define a new task of implicit relation inference and construct a benchmark, IMPLICITRELATIONS, where given a question, a model should output a list of concept-relation pairs, in which the relations describe the implicit reasoning steps required for answering the question. Using IMPLICITRELATIONS, we evaluate models from the GPT-3 family and find that, while these models struggle on the implicit reasoning QA task, they often succeed at inferring implicit relations. This suggests that the challenge in implicit reasoning questions does not stem from the need to plan a reasoning strategy alone, but from the need to do so while also retrieving and reasoning over relevant information.
Anthology ID: 2022.findings-emnlp.188
Volume: Findings of the Association for Computational Linguistics: EMNLP 2022
Month: December
Year: 2022
Address: Abu Dhabi, United Arab Emirates
Editors: Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 2548–2566
URL: https://aclanthology.org/2022.findings-emnlp.188
DOI: 10.18653/v1/2022.findings-emnlp.188
Cite (ACL): Uri Katz, Mor Geva, and Jonathan Berant. 2022. Inferring Implicit Relations in Complex Questions with Language Models. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 2548–2566, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal): Inferring Implicit Relations in Complex Questions with Language Models (Katz et al., Findings 2022)
PDF: https://aclanthology.org/2022.findings-emnlp.188.pdf