Evidence-Focused Fact Summarization for Knowledge-Augmented Zero-Shot Question Answering

Sungho Ko, Hyunjin Cho, Hyungjoo Chae, Jinyoung Yeo, Dongha Lee


Abstract
Recent studies have investigated utilizing Knowledge Graphs (KGs) to enhance Question Answering (QA) performance of Large Language Models (LLMs), yet structured KG verbalization remains challenging. Existing methods, like concatenation or free-form textual conversion of triples, have limitations, including duplicated entities or relations, reduced evidence density, and failure to highlight crucial evidence. To address these issues, we propose EFSum, an Evidence-focused Fact Summarization framework for enhanced QA with knowledge-augmented LLMs. We optimize an LLM as a fact summarizer through distillation and preference alignment. Our extensive experiments show that EFSum improves LLM’s zero-shot QA performance with its helpful and faithful summaries, especially when noisy facts are retrieved.
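To make the contrast in the abstract concrete, below is a minimal, hypothetical sketch (not the paper's implementation) of the two verbalization styles it discusses: naive triple concatenation versus an evidence-focused summarization prompt for a summarizer LLM. All function names, variable names, and the prompt wording are illustrative assumptions, not taken from EFSum.

```python
# Hypothetical sketch of KG-triple verbalization styles described in the abstract.
# Triples are (subject, relation, object); all names here are illustrative.

triples = [
    ("Miami", "located_in", "Florida"),
    ("Miami", "located_in", "United States"),
    ("Florida", "capital", "Tallahassee"),
]

def concat_verbalize(triples):
    """Baseline: plain concatenation of triples. Entities and relations
    repeat, and nothing signals which facts are the crucial evidence."""
    return " ".join(f"({s}, {r}, {o})" for s, r, o in triples)

def summary_prompt(question, triples):
    """Evidence-focused alternative: prompt a summarizer LLM to compress
    the retrieved facts into a short passage that foregrounds the evidence
    relevant to the question (the role EFSum trains an LLM to play)."""
    facts = "\n".join(f"- {s} {r.replace('_', ' ')} {o}" for s, r, o in triples)
    return (
        "Summarize the following facts into a short passage containing "
        "the evidence needed to answer the question.\n"
        f"Question: {question}\nFacts:\n{facts}\nSummary:"
    )

question = "What is the capital of the state that Miami is located in?"
print(concat_verbalize(triples))
print(summary_prompt(question, triples))
```

Under this reading, the concatenated string would be passed directly to the QA model, while the summarization prompt would first go through a distilled, preference-aligned summarizer whose output replaces the raw triples in the QA model's context.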
Anthology ID:
2024.emnlp-main.594
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
10636–10651
URL:
https://aclanthology.org/2024.emnlp-main.594
Cite (ACL):
Sungho Ko, Hyunjin Cho, Hyungjoo Chae, Jinyoung Yeo, and Dongha Lee. 2024. Evidence-Focused Fact Summarization for Knowledge-Augmented Zero-Shot Question Answering. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 10636–10651, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Evidence-Focused Fact Summarization for Knowledge-Augmented Zero-Shot Question Answering (Ko et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.594.pdf