%0 Conference Proceedings
%T Scalable Multi-Hop Relational Reasoning for Knowledge-Aware Question Answering
%A Feng, Yanlin
%A Chen, Xinyue
%A Lin, Bill Yuchen
%A Wang, Peifeng
%A Yan, Jun
%A Ren, Xiang
%Y Webber, Bonnie
%Y Cohn, Trevor
%Y He, Yulan
%Y Liu, Yang
%S Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
%D 2020
%8 November
%I Association for Computational Linguistics
%C Online
%F feng-etal-2020-scalable
%X Existing work on augmenting question answering (QA) models with external knowledge (e.g., knowledge graphs) either struggles to model multi-hop relations efficiently, or lacks transparency into the model’s prediction rationale. In this paper, we propose a novel knowledge-aware approach that equips pre-trained language models (PTLMs) with a multi-hop relational reasoning module, named multi-hop graph relation network (MHGRN). It performs multi-hop, multi-relational reasoning over subgraphs extracted from external knowledge graphs. The proposed reasoning module unifies path-based reasoning methods and graph neural networks to achieve better interpretability and scalability. We also empirically show its effectiveness and scalability on the CommonsenseQA and OpenbookQA datasets, and interpret its behaviors with case studies, with the code for the experiments released.
%R 10.18653/v1/2020.emnlp-main.99
%U https://aclanthology.org/2020.emnlp-main.99
%U https://doi.org/10.18653/v1/2020.emnlp-main.99
%P 1295-1309