Uncertainty Guided Global Memory Improves Multi-Hop Question Answering

Alsu Sagirova, Mikhail Burtsev


Abstract
Transformers have become the gold standard for many natural language processing tasks and, in particular, for multi-hop question answering (MHQA). This task involves processing a long document and reasoning over multiple parts of it. The landscape of MHQA approaches can be classified into two primary categories. The first group focuses on extracting supporting evidence, thereby constraining the QA model’s context to predicted facts. Conversely, the second group relies on the attention mechanism of the long-input encoding model to facilitate multi-hop reasoning. However, attention-based token representations lack explicit global contextual information to connect reasoning steps. To address these issues, we propose GEMFormer, a two-stage method that first collects relevant information from the entire document into memory and then combines it with the local context to solve the task. Our experimental results show that fine-tuning a pre-trained model with memory-augmented input, including the most certain global elements, improves the model’s performance on three MHQA datasets compared to the baseline. We also found that the global explicit memory contains information from supporting facts required for the correct answer.
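The abstract describes a two-stage pipeline: an uncertainty-guided pass that selects the tokens the model is most certain about as a global memory, followed by a pass over the memory-augmented input. The sketch below is an illustrative reconstruction based only on that description, not the authors' released code; the function names, the entropy-based certainty criterion, and the input layout are assumptions.

```python
import numpy as np

def token_entropy(probs):
    """Shannon entropy of per-token predictive distributions.
    probs: (seq_len, vocab_size) array of token probabilities.
    Low entropy = the model is certain about that position."""
    return -np.sum(probs * np.log(probs + 1e-12), axis=-1)

def select_memory_tokens(tokens, probs, k):
    """Stage 1 (hypothetical): keep the k tokens with the lowest
    predictive entropy as the global memory, in document order."""
    ent = token_entropy(probs)
    idx = np.argsort(ent)[:k]
    return [tokens[i] for i in sorted(idx)]

def build_memory_augmented_input(question, memory, context, sep="[SEP]"):
    """Stage 2 (hypothetical layout): prepend the global memory
    to the local context before re-encoding."""
    return f"{question} {sep} {' '.join(memory)} {sep} {context}"

# Toy example: peaked distributions (certain tokens) enter the memory,
# near-uniform ones (uncertain tokens) are left out.
tokens = ["Paris", "is", "maybe", "capital"]
probs = np.array([
    [0.97, 0.01, 0.01, 0.01],   # "Paris"  - certain
    [0.25, 0.25, 0.25, 0.25],   # "is"     - maximally uncertain
    [0.40, 0.30, 0.20, 0.10],   # "maybe"  - uncertain
    [0.01, 0.01, 0.01, 0.97],   # "capital"- certain
])
memory = select_memory_tokens(tokens, probs, k=2)
augmented = build_memory_augmented_input("Q?", memory, " ".join(tokens))
```

In this toy run the memory is `["Paris", "capital"]`, since those positions have sharply peaked (low-entropy) distributions, mirroring the paper's idea of collecting the "most certain global elements" before fine-tuning on the augmented input.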
Anthology ID:
2023.emnlp-main.262
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
4317–4328
URL:
https://aclanthology.org/2023.emnlp-main.262
DOI:
10.18653/v1/2023.emnlp-main.262
Cite (ACL):
Alsu Sagirova and Mikhail Burtsev. 2023. Uncertainty Guided Global Memory Improves Multi-Hop Question Answering. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 4317–4328, Singapore. Association for Computational Linguistics.
Cite (Informal):
Uncertainty Guided Global Memory Improves Multi-Hop Question Answering (Sagirova & Burtsev, EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.262.pdf
Video:
https://aclanthology.org/2023.emnlp-main.262.mp4