%0 Conference Proceedings
%T FiE: Building a Global Probability Space by Leveraging Early Fusion in Encoder for Open-Domain Question Answering
%A Kedia, Akhil
%A Zaidi, Mohd Abbas
%A Lee, Haejun
%Y Goldberg, Yoav
%Y Kozareva, Zornitsa
%Y Zhang, Yue
%S Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
%D 2022
%8 December
%I Association for Computational Linguistics
%C Abu Dhabi, United Arab Emirates
%F kedia-etal-2022-fie
%X Generative models have recently started to outperform extractive models in Open-Domain Question Answering, largely by leveraging their decoder to attend over multiple encoded passages and combining their information. However, generative models tend to be larger than extractive models due to the need for a decoder, run slower during inference due to auto-regressive decoder beam search, and their generated output often suffers from hallucinations. We propose to extend transformer encoders with the ability to fuse information from multiple passages, using a global representation to provide cross-sample attention over all tokens across samples. Furthermore, we propose an alternative answer span probability calculation to better aggregate answer scores in the global space of all samples. Using our proposed method, we outperform the current state-of-the-art method by 2.5 Exact Match score on the Natural Questions dataset while using only 25% of the parameters and 35% of the latency during inference, and by 4.4 Exact Match on the WebQuestions dataset. When coupled with synthetic data augmentation, we outperform larger models on the TriviaQA dataset as well. The latency and parameter savings of our method make it particularly attractive for open-domain question answering, as these models are often compute-intensive.
%R 10.18653/v1/2022.emnlp-main.285
%U https://aclanthology.org/2022.emnlp-main.285
%U https://doi.org/10.18653/v1/2022.emnlp-main.285
%P 4246-4260
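
Annotation (not part of the Anthology record, and not the authors' code): the "global probability space" the abstract describes can be sketched in a few lines. Instead of normalizing start/end logits within each passage independently, an extractive reader can pool the logits from every retrieved passage and apply a single softmax, so span probabilities are directly comparable, and summable for duplicate answer strings, across passages. The PyTorch sketch below illustrates only that probability calculation; the function name, shapes, and span-length cap are illustrative assumptions, and the paper's early-fusion encoder itself is not shown.

import torch
import torch.nn.functional as F

def global_span_log_probs(start_logits, end_logits, max_span_len=10):
    # start_logits, end_logits: [num_passages, seq_len] token logits for one
    # question, from any extractive reader head (illustrative shapes).
    num_passages, seq_len = start_logits.shape
    # One softmax over all tokens of all passages: a single global probability
    # space, rather than a separate per-passage softmax.
    log_p_start = F.log_softmax(start_logits.reshape(-1), dim=0).view(num_passages, seq_len)
    log_p_end = F.log_softmax(end_logits.reshape(-1), dim=0).view(num_passages, seq_len)
    # Score every within-passage span (i, j): log p = log p_start(i) + log p_end(j).
    scores = log_p_start.unsqueeze(2) + log_p_end.unsqueeze(1)  # [P, L, L]
    # Keep only spans with j >= i and length below max_span_len.
    valid = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=0)
    valid &= ~torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=max_span_len)
    return scores.masked_fill(~valid, float("-inf"))

# Toy usage: 3 passages of 8 tokens each; pick the best span globally.
start, end = torch.randn(3, 8), torch.randn(3, 8)
span_scores = global_span_log_probs(start, end, max_span_len=4)
flat = span_scores.view(-1).argmax().item()
passage, rest = divmod(flat, 8 * 8)
i, j = divmod(rest, 8)

Because all scores live in one shared distribution, the probabilities of spans whose surface strings match can be summed (e.g. via torch.logsumexp over their log-probabilities) to aggregate evidence across passages, which is the cross-sample aggregation behavior the abstract alludes to.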