ePiC: Employing Proverbs in Context as a Benchmark for Abstract Language Understanding

Sayan Ghosh, Shashank Srivastava


Abstract
While large language models have shown exciting progress on several NLP benchmarks, evaluating their ability to perform complex analogical reasoning remains under-explored. Here, we introduce a high-quality crowdsourced dataset of narratives for employing proverbs in context as a benchmark for abstract language understanding. The dataset provides fine-grained annotation of aligned spans between proverbs and narratives, and contains minimal lexical overlap between narratives and proverbs, ensuring that models need to go beyond surface-level reasoning to succeed. We explore three tasks: (1) proverb recommendation and alignment prediction, (2) narrative generation for a given proverb and topic, and (3) identifying narratives with similar motifs. Our experiments show that neural language models struggle on these tasks compared to humans, and that these tasks pose multiple learning challenges.
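
The proverb recommendation task described above can be framed as retrieval over a fixed set of candidate proverbs given a narrative. The sketch below is an illustrative baseline, not the authors' method or the released code in sgdgp/epic: it assumes an off-the-shelf encoder from the sentence-transformers library and uses toy proverb and narrative strings rather than actual ePiC examples. Because the dataset deliberately minimizes lexical overlap between narratives and proverbs, this kind of surface-level semantic matching is exactly what the benchmark is designed to stress-test.

# Illustrative sketch (not the paper's method): proverb recommendation as
# retrieval with a generic sentence-embedding model. The proverbs and the
# narrative below are hypothetical examples, not drawn from the ePiC dataset.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed off-the-shelf encoder

proverbs = [
    "A stitch in time saves nine",
    "Don't count your chickens before they hatch",
    "Actions speak louder than words",
]
narrative = (
    "She noticed the small crack in the dam wall early and reported it, "
    "sparing the town a costly repair later that year."
)

# Embed the candidate proverbs and the narrative, then rank by cosine similarity.
proverb_emb = model.encode(proverbs, convert_to_tensor=True)
narrative_emb = model.encode(narrative, convert_to_tensor=True)
scores = util.cos_sim(narrative_emb, proverb_emb)  # shape: (1, num_proverbs)

best = scores.argmax().item()
print(f"Recommended proverb: {proverbs[best]} (score={scores[0, best]:.3f})")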
Anthology ID:
2022.acl-long.276
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3989–4004
URL:
https://aclanthology.org/2022.acl-long.276
DOI:
10.18653/v1/2022.acl-long.276
Cite (ACL):
Sayan Ghosh and Shashank Srivastava. 2022. ePiC: Employing Proverbs in Context as a Benchmark for Abstract Language Understanding. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 3989–4004, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
ePiC: Employing Proverbs in Context as a Benchmark for Abstract Language Understanding (Ghosh & Srivastava, ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.276.pdf
Software:
 2022.acl-long.276.software.zip
Video:
 https://aclanthology.org/2022.acl-long.276.mp4
Code:
 sgdgp/epic
Data:
ePiC