CReTIHC: Designing Causal Reasoning Tasks about Temporal Interventions and Hallucinated Confoundings

Changwoo Chun, SongEun Lee, Jaehyung Seo, Heuiseok Lim


Abstract
Large language models (LLMs) have demonstrated impressive capabilities in natural language processing. However, establishing causal relationships, particularly in the context of temporal interventions and language hallucinations, remains challenging for them. This paper presents CReTIHC, a novel dataset designed to test and enhance the causal reasoning abilities of LLMs. The dataset is constructed with a unique approach that incorporates elements of verbal hallucination and temporal intervention by reengineering existing causal inference datasets. This transformation creates complex scenarios that push LLMs to critically evaluate the information presented and identify cause-and-effect relationships. The CReTIHC dataset serves as a pioneering tool for improving LLMs' causal inference capabilities, paving the way for a more nuanced understanding of causal relationships in natural language processing (NLP) tasks. The full dataset is publicly available at https://github.com/ChangwooChun/CReTIHC
Anthology ID:
2023.findings-emnlp.693
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
10334–10343
URL:
https://aclanthology.org/2023.findings-emnlp.693
DOI:
10.18653/v1/2023.findings-emnlp.693
Cite (ACL):
Changwoo Chun, SongEun Lee, Jaehyung Seo, and Heuiseok Lim. 2023. CReTIHC: Designing Causal Reasoning Tasks about Temporal Interventions and Hallucinated Confoundings. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 10334–10343, Singapore. Association for Computational Linguistics.
Cite (Informal):
CReTIHC: Designing Causal Reasoning Tasks about Temporal Interventions and Hallucinated Confoundings (Chun et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.693.pdf