CPopQA: Ranking Cultural Concept Popularity by LLMs

Ming Jiang, Mansi Joshi


Abstract
Many recent studies examining the knowledge capacity of large language models (LLMs) have focused on knowledge explicitly learned from the pretraining data or implicitly inferable from similar contexts. However, the extent to which an LLM effectively captures corpus-level statistical trends of concepts for reasoning, especially long-tail ones, remains largely underexplored. In this study, we introduce a novel few-shot question-answering task (CPopQA) that examines LLMs’ statistical ranking abilities for long-tail cultural concepts (e.g., holidays), focusing in particular on these concepts’ popularity in the United States and the United Kingdom, respectively. We curate a dataset of 457 holidays across 58 countries, generating a total of 9,000 QA testing pairs. Experiments on four strong LLMs show that open-source LLMs still lag far behind closed LLM APIs (e.g., GPT-3.5) in the statistical ranking of cultural concepts. Notably, GPT-3.5 exhibited the potential to identify geo-cultural proximity across continents.
Anthology ID:
2024.naacl-short.52
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 2: Short Papers)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
615–630
URL:
https://aclanthology.org/2024.naacl-short.52
DOI:
10.18653/v1/2024.naacl-short.52
Cite (ACL):
Ming Jiang and Mansi Joshi. 2024. CPopQA: Ranking Cultural Concept Popularity by LLMs. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 2: Short Papers), pages 615–630, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
CPopQA: Ranking Cultural Concept Popularity by LLMs (Jiang & Joshi, NAACL 2024)
PDF:
https://aclanthology.org/2024.naacl-short.52.pdf