Few-Shot Data Synthesis for Open Domain Multi-Hop Question Answering

Mingda Chen, Xilun Chen, Wen-tau Yih


Abstract
Few-shot learning for open domain multi-hop question answering typically relies on the in-context learning capability of large language models (LLMs). While powerful, these LLMs usually contain tens or hundreds of billions of parameters, making them rather inefficient at inference time. To improve the performance of smaller language models, we propose a data synthesis framework for multi-hop question answering that requires fewer than 10 human-annotated question-answer pairs. Our framework depends only on rich, naturally occurring relationships among documents and is built upon data generation functions parameterized by LLMs and prompts. We synthesize millions of multi-hop questions and claims to finetune language models, which we evaluate on popular benchmarks for multi-hop question answering and fact verification. Empirically, our approach improves model performance significantly, allowing the finetuned models to be competitive with GPT-3.5-based approaches while being almost one-third the size in parameter count.
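
The abstract describes data generation functions parameterized by an LLM and a small few-shot prompt. The sketch below illustrates one plausible shape such a function could take; it is not the paper's released code. The `Document` class, the seed examples, and the `call_llm` backend are hypothetical placeholders, assuming pairs of documents linked by some naturally occurring relationship (e.g., a shared entity) as input.

```python
# Minimal sketch (assumptions, not the authors' implementation) of a prompt-based
# data generation function: given two linked documents, an LLM is prompted with a
# handful of seed examples to compose a synthetic 2-hop question-answer pair.
from dataclasses import dataclass


@dataclass
class Document:
    title: str
    text: str


# Fewer than 10 human-annotated seed pairs, mirroring the few-shot setting.
SEED_EXAMPLES = [
    {
        "doc_a": "Lothair II was the king of Lotharingia.",
        "doc_b": "Lotharingia was a medieval kingdom in western Europe.",
        "question": "Lothair II ruled a kingdom located in which part of Europe?",
        "answer": "western Europe",
    },
    # ... remaining hand-written seed pairs would go here
]


def build_prompt(doc_a: Document, doc_b: Document) -> str:
    """Format the seed examples plus a new document pair into a few-shot prompt."""
    parts = []
    for ex in SEED_EXAMPLES:
        parts.append(
            f"Document A: {ex['doc_a']}\n"
            f"Document B: {ex['doc_b']}\n"
            f"Question: {ex['question']}\n"
            f"Answer: {ex['answer']}\n"
        )
    parts.append(
        f"Document A: {doc_a.text}\n"
        f"Document B: {doc_b.text}\n"
        f"Question:"
    )
    return "\n".join(parts)


def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for any text-completion backend; not a real API."""
    raise NotImplementedError("plug in an LLM completion call here")


def synthesize_pair(doc_a: Document, doc_b: Document) -> dict:
    """Generate one synthetic multi-hop (question, answer) pair from linked docs.

    Assumes the LLM continues the prompt in the same format, i.e. it emits the
    question text followed by 'Answer: <answer>'.
    """
    completion = call_llm(build_prompt(doc_a, doc_b))
    question, _, answer = completion.partition("Answer:")
    return {"question": question.strip(), "answer": answer.strip()}
```

Millions of such synthetic pairs could then be used as finetuning data for a smaller reader model, which is the overall recipe the abstract outlines.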
Anthology ID:
2024.eacl-long.12
Volume:
Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
190–208
URL:
https://aclanthology.org/2024.eacl-long.12
Cite (ACL):
Mingda Chen, Xilun Chen, and Wen-tau Yih. 2024. Few-Shot Data Synthesis for Open Domain Multi-Hop Question Answering. In Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers), pages 190–208, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
Few-Shot Data Synthesis for Open Domain Multi-Hop Question Answering (Chen et al., EACL 2024)
PDF:
https://aclanthology.org/2024.eacl-long.12.pdf