Larger Probes Tell a Different Story: Extending Psycholinguistic Datasets Via In-Context Learning

Namrata Shivagunde, Vladislav Lialin, Anna Rumshisky


Abstract
Language model probing is often used to test specific capabilities of models. However, conclusions from such studies may be limited when the probing benchmarks are small and lack statistical power. In this work, we introduce new, larger datasets for negation (NEG-1500-SIMP) and role reversal (ROLE-1500) inspired by psycholinguistic studies. We dramatically extend the existing NEG-136 and ROLE-88 benchmarks using GPT3, increasing their size from 18 and 44 sentence pairs to 750 each. We also create a template-based version of the extended negation dataset (NEG-1500-SIMP-TEMP), consisting of 770 sentence pairs. We evaluate 22 models on the extended datasets and find that model performance drops by 20-57% compared to the original smaller benchmarks. We observe high levels of negation sensitivity in models like BERT and ALBERT, suggesting that previous findings may have been skewed by smaller test sets. Finally, we observe that although GPT3 generated all the examples in ROLE-1500, it is able to solve only 24.6% of them during probing. The datasets and code are available on GitHub.
Anthology ID:
2023.emnlp-main.130
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2094–2107
URL:
https://aclanthology.org/2023.emnlp-main.130
DOI:
10.18653/v1/2023.emnlp-main.130
Cite (ACL):
Namrata Shivagunde, Vladislav Lialin, and Anna Rumshisky. 2023. Larger Probes Tell a Different Story: Extending Psycholinguistic Datasets Via In-Context Learning. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 2094–2107, Singapore. Association for Computational Linguistics.
Cite (Informal):
Larger Probes Tell a Different Story: Extending Psycholinguistic Datasets Via In-Context Learning (Shivagunde et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.130.pdf
Video:
https://aclanthology.org/2023.emnlp-main.130.mp4