Probing Power by Prompting: Harnessing Pre-trained Language Models for Power Connotation Framing

Shima Khanehzar, Trevor Cohn, Gosia Mikolajczak, Lea Frermann


Abstract
When describing actions, subtle changes in word choice can evoke very different associations with the involved entities. For instance, a company that ‘employs’ its workers evokes a more positive connotation than one that ‘exploits’ them. This implied meaning is known as connotation. This paper investigates whether pre-trained language models (PLMs) encode such subtle connotative information about power differentials between involved entities. We design a probing framework for power connotation, building on (CITATION)’s operationalization of connotation frames. We show that zero-shot prompting of PLMs leads to above-chance prediction of power connotation; however, fine-tuning PLMs with our framework drastically improves their accuracy. Using our fine-tuned models, we present a case study of power dynamics in US news reporting on immigration, showing the potential of our framework as a tool for understanding subtle bias in the media.
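To make the zero-shot setup concrete, the sketch below illustrates the general idea of prompting a masked PLM for power connotation. It is a minimal illustration only: the model, template, and entity words are assumptions for this example and are not the prompts or verbalizers used in the paper.

```python
# Minimal zero-shot sketch (illustrative; not the paper's actual prompts or models):
# query a masked LM with a power-related template and compare the probabilities
# it assigns to the agent vs. the theme of the sentence.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="roberta-base")

def power_connotation(sentence: str, agent: str, theme: str) -> str:
    """Return which entity the model frames as holding more power."""
    # Hypothetical template; the paper's templates may differ.
    prompt = f"{sentence} Here, the <mask> is in the position of power."
    # Restrict mask predictions to the two entity words and compare their scores.
    preds = fill_mask(prompt, targets=[f" {agent}", f" {theme}"])
    best = max(preds, key=lambda p: p["score"])
    return agent if best["token_str"].strip() == agent else theme

print(power_connotation("The company exploits its workers.", "company", "workers"))
# Illustrative expectation: 'company', since 'exploits' frames the agent as
# holding power over the theme.
```

Fine-tuning, as the abstract notes, would further train such a model on labeled power-connotation data rather than relying on the pre-trained weights alone.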
Anthology ID:
2023.eacl-main.61
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
873–885
URL:
https://aclanthology.org/2023.eacl-main.61
DOI:
10.18653/v1/2023.eacl-main.61
Cite (ACL):
Shima Khanehzar, Trevor Cohn, Gosia Mikolajczak, and Lea Frermann. 2023. Probing Power by Prompting: Harnessing Pre-trained Language Models for Power Connotation Framing. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 873–885, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Probing Power by Prompting: Harnessing Pre-trained Language Models for Power Connotation Framing (Khanehzar et al., EACL 2023)
PDF:
https://aclanthology.org/2023.eacl-main.61.pdf
Video:
https://aclanthology.org/2023.eacl-main.61.mp4