Unequal Scientific Recognition in the Age of LLMs

Yixuan Liu, Abel Elekes, Jianglin Lu, Rodrigo Dorantes-Gilardi, Albert-Laszlo Barabasi


Abstract
Large language models (LLMs) are reshaping how scientific knowledge is accessed and represented. This study evaluates the extent to which popular and frontier LLMs, including GPT-4o, Claude 3.5 Sonnet, and Gemini 1.5 Pro, recognize scientists, benchmarking their outputs against OpenAlex and Wikipedia. Using a dataset of 100,000 physicists from OpenAlex to evaluate LLM recognition, we uncover substantial disparities: LLMs exhibit selective and inconsistent recognition patterns. Recognition correlates strongly with scholarly impact, such as citations, yet remains uneven across gender and geography: women researchers and researchers from Africa, Asia, and Latin America are significantly underrecognized. We further examine the role of training data provenance, identifying Wikipedia as a potential source that contributes to recognition gaps. Our findings highlight how LLMs can reflect, and potentially amplify, existing disparities in science, underscoring the need for more transparent and inclusive knowledge systems.
Anthology ID:
2025.findings-emnlp.1279
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
23558–23568
URL:
https://aclanthology.org/2025.findings-emnlp.1279/
Cite (ACL):
Yixuan Liu, Abel Elekes, Jianglin Lu, Rodrigo Dorantes-Gilardi, and Albert-Laszlo Barabasi. 2025. Unequal Scientific Recognition in the Age of LLMs. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 23558–23568, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Unequal Scientific Recognition in the Age of LLMs (Liu et al., Findings 2025)
PDF:
https://aclanthology.org/2025.findings-emnlp.1279.pdf
Checklist:
2025.findings-emnlp.1279.checklist.pdf