%0 Conference Proceedings
%T Enhancing Multiple-Choice Question Answering with Causal Knowledge
%A Dalal, Dhairya
%A Arcan, Mihael
%A Buitelaar, Paul
%Y Agirre, Eneko
%Y Apidianaki, Marianna
%Y Vulić, Ivan
%S Proceedings of Deep Learning Inside Out (DeeLIO): The 2nd Workshop on Knowledge Extraction and Integration for Deep Learning Architectures
%D 2021
%8 June
%I Association for Computational Linguistics
%C Online
%F dalal-etal-2021-enhancing
%X The task of causal question answering aims to reason about causes and effects over a provided real or hypothetical premise. Recent approaches have converged on using transformer-based language models to solve question answering tasks. However, pretrained language models often struggle when external knowledge is not present in the premise or when additional context is required to answer the question. To the best of our knowledge, no prior work has explored the efficacy of augmenting pretrained language models with external causal knowledge for multiple-choice causal question answering. In this paper, we present novel strategies for the representation of causal knowledge. Our empirical results demonstrate the efficacy of augmenting pretrained models with external causal knowledge. We show improved performance on the COPA (Choice of Plausible Alternatives) and WIQA (What If Reasoning Over Procedural Text) benchmark tasks. On the WIQA benchmark, our approach is competitive with the state-of-the-art and exceeds it within the evaluation subcategories of In-Paragraph and Out-of-Paragraph perturbations.
%R 10.18653/v1/2021.deelio-1.8
%U https://aclanthology.org/2021.deelio-1.8
%U https://doi.org/10.18653/v1/2021.deelio-1.8
%P 70-80