AbsPyramid: Benchmarking the Abstraction Ability of Language Models with a Unified Entailment Graph

Zhaowei Wang, Haochen Shi, Weiqi Wang, Tianqing Fang, Hongming Zhang, Sehyun Choi, Xin Liu, Yangqiu Song
Abstract
Cognitive research indicates that abstraction ability is essential to human intelligence, yet it remains under-explored in language models. In this paper, we present AbsPyramid, a unified entailment graph of 221K textual descriptions of abstraction knowledge. While existing resources only cover nouns or verbs within simplified events or specific domains, AbsPyramid collects abstraction knowledge for three components of diverse events to comprehensively evaluate the abstraction ability of language models in the open domain. Experimental results demonstrate that current LLMs struggle to comprehend abstraction knowledge in zero-shot and few-shot settings. By training on our rich abstraction knowledge, we find that LLMs can acquire basic abstraction abilities and generalize to unseen events. Meanwhile, we empirically show that our benchmark is comprehensive enough to enhance LLMs on two previous abstraction tasks.
Anthology ID:
2024.findings-naacl.252
Volume:
Findings of the Association for Computational Linguistics: NAACL 2024
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3991–4010
URL:
https://aclanthology.org/2024.findings-naacl.252
Cite (ACL):
Zhaowei Wang, Haochen Shi, Weiqi Wang, Tianqing Fang, Hongming Zhang, Sehyun Choi, Xin Liu, and Yangqiu Song. 2024. AbsPyramid: Benchmarking the Abstraction Ability of Language Models with a Unified Entailment Graph. In Findings of the Association for Computational Linguistics: NAACL 2024, pages 3991–4010, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
AbsPyramid: Benchmarking the Abstraction Ability of Language Models with a Unified Entailment Graph (Wang et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-naacl.252.pdf
Copyright:
2024.findings-naacl.252.copyright.pdf