Exploring Language Model Generalization in Low-Resource Extractive QA

Saptarshi Sengupta, Wenpeng Yin, Preslav Nakov, Shreya Ghosh, Suhang Wang


Abstract
In this paper, we investigate Extractive Question Answering (EQA) with Large Language Models (LLMs) under domain drift, i.e., can LLMs generalize to domains that require specific knowledge, such as medicine and law, in a zero-shot fashion without additional in-domain training? To this end, we devise a series of experiments to explain the performance gap empirically. Our findings suggest that: (a) LLMs struggle with dataset demands of closed domains, such as retrieving long answer spans; (b) certain LLMs, despite strong overall performance, display weaknesses in meeting basic requirements, such as discriminating between domain-specific senses of words, which we link to pre-processing decisions; (c) scaling model parameters is not always effective for cross-domain generalization; and (d) closed-domain datasets differ quantitatively from open-domain EQA datasets, and current LLMs struggle to handle them. Our findings point to important directions for improving existing LLMs.
Anthology ID:
2025.coling-main.474
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
7106–7126
URL:
https://aclanthology.org/2025.coling-main.474/
Cite (ACL):
Saptarshi Sengupta, Wenpeng Yin, Preslav Nakov, Shreya Ghosh, and Suhang Wang. 2025. Exploring Language Model Generalization in Low-Resource Extractive QA. In Proceedings of the 31st International Conference on Computational Linguistics, pages 7106–7126, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Exploring Language Model Generalization in Low-Resource Extractive QA (Sengupta et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.474.pdf