%0 Conference Proceedings
%T ReasonBERT: Pre-trained to Reason with Distant Supervision
%A Deng, Xiang
%A Su, Yu
%A Lees, Alyssa
%A Wu, You
%A Yu, Cong
%A Sun, Huan
%Y Moens, Marie-Francine
%Y Huang, Xuanjing
%Y Specia, Lucia
%Y Yih, Scott Wen-tau
%S Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
%D 2021
%8 November
%I Association for Computational Linguistics
%C Online and Punta Cana, Dominican Republic
%F deng-etal-2021-reasonbert
%X We present ReasonBert, a pre-training method that augments language models with the ability to reason over long-range relations and multiple, possibly hybrid contexts. Unlike existing pre-training methods that only harvest learning signals from local contexts of naturally occurring texts, we propose a generalized notion of distant supervision to automatically connect multiple pieces of text and tables to create pre-training examples that require long-range reasoning. Different types of reasoning are simulated, including intersecting multiple pieces of evidence, bridging from one piece of evidence to another, and detecting unanswerable cases. We conduct a comprehensive evaluation on a variety of extractive question answering datasets ranging from single-hop to multi-hop and from text-only to table-only to hybrid that require various reasoning capabilities and show that ReasonBert achieves remarkable improvement over an array of strong baselines. Few-shot experiments further demonstrate that our pre-training method substantially improves sample efficiency.
%R 10.18653/v1/2021.emnlp-main.494
%U https://aclanthology.org/2021.emnlp-main.494
%U https://doi.org/10.18653/v1/2021.emnlp-main.494
%P 6112-6127