%0 Conference Proceedings %T Knowledge Transfer between Structured and Unstructured Sources for Complex Question Answering %A Mo, Lingbo %A Wang, Zhen %A Zhao, Jie %A Sun, Huan %Y Chen, Wenhu %Y Chen, Xinyun %Y Chen, Zhiyu %Y Yao, Ziyu %Y Yasunaga, Michihiro %Y Yu, Tao %Y Zhang, Rui %S Proceedings of the Workshop on Structured and Unstructured Knowledge Integration (SUKI) %D 2022 %8 July %I Association for Computational Linguistics %C Seattle, USA %F mo-etal-2022-knowledge %X Multi-hop question answering (QA) combines multiple pieces of evidence to search for the correct answer. Reasoning over a text corpus (TextQA) and/or a knowledge base (KBQA) has been extensively studied and has led to distinct system architectures. However, knowledge transfer between these two types of QA systems has been under-explored. Research questions, such as what knowledge is transferred and whether the transferred knowledge can help answer questions over one source using the other, remain open. In this paper, we therefore study the knowledge transfer of multi-hop reasoning between structured and unstructured sources. We first propose a unified QA framework named SimultQA to enable knowledge transfer and bridge the distinct supervision signals from KB and text sources. Then, we conduct extensive analyses to explore how knowledge is transferred by leveraging the pre-training and fine-tuning paradigm. We focus on low-resource fine-tuning to show that pre-training SimultQA on one source can substantially improve its performance on the other. More fine-grained analyses of transfer behaviors reveal the types of transferred knowledge and transfer patterns. We conclude with insights into how future work can construct better QA datasets and systems to exploit knowledge transfer. %R 10.18653/v1/2022.suki-1.7 %U https://aclanthology.org/2022.suki-1.7 %U https://doi.org/10.18653/v1/2022.suki-1.7 %P 55-66