Language is Scary when Over-Analyzed: Unpacking Implied Misogynistic Reasoning with Argumentation Theory-Driven Prompts

Arianna Muti, Federico Ruggeri, Khalid Khatib, Alberto Barrón-Cedeño, Tommaso Caselli


Abstract
We propose misogyny detection as an Argumentative Reasoning task and investigate the capacity of large language models (LLMs) to understand the implicit reasoning used to convey misogyny in both Italian and English. The central aim is to generate the missing reasoning link between a message and the implied meanings encoding the misogyny. Our study uses argumentation theory as a foundation to form a collection of prompts in both zero-shot and few-shot settings. These prompts integrate different techniques, including chain-of-thought reasoning and augmented knowledge. Our findings show that LLMs fall short in reasoning about misogynistic comments: to generate implied assumptions, they mostly rely on implicit knowledge derived from internalized stereotypes about women rather than on inductive reasoning.
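To make the setup concrete, the sketch below illustrates how an argumentation theory-driven prompt for generating the implied assumption (the unstated warrant linking a message to its misogynistic meaning) could be assembled in zero-shot and few-shot form. This is a minimal illustration, not the authors' prompts: the template wording, the demonstration example, and the helper build_prompt are assumptions introduced here.

# Minimal sketch (hypothetical, not taken from the paper): composing
# zero-shot and few-shot prompts that ask an LLM to make explicit the
# implied assumption behind a potentially misogynistic message.

ZERO_SHOT_TEMPLATE = (
    "A message can convey misogyny through an unstated assumption (the warrant) "
    "that connects what is said to an implied meaning about women.\n"
    "Message: {message}\n"
    "Reason step by step to identify the implied assumption that would make "
    "this message misogynistic, then state it in one sentence."
)

# Hypothetical demonstration for the few-shot setting; not from the paper's data.
FEW_SHOT_EXAMPLE = (
    "Message: 'Women should stay in the kitchen.'\n"
    "Implied assumption: Women's role is limited to domestic work, so they "
    "do not belong in other spheres.\n\n"
)


def build_prompt(message: str, few_shot: bool = False) -> str:
    """Compose a zero-shot or few-shot prompt for implied-assumption generation."""
    prompt = ZERO_SHOT_TEMPLATE.format(message=message)
    if few_shot:
        prompt = FEW_SHOT_EXAMPLE + prompt
    return prompt


if __name__ == "__main__":
    print(build_prompt("Typical of a woman to get emotional over nothing.", few_shot=True))

The same template could be extended with retrieved background knowledge (the "augmented knowledge" variant mentioned in the abstract) by prepending a context paragraph before the message; how that knowledge is selected is not specified in this section.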
Anthology ID:
2024.emnlp-main.1174
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
21091–21107
URL:
https://aclanthology.org/2024.emnlp-main.1174
Cite (ACL):
Arianna Muti, Federico Ruggeri, Khalid Khatib, Alberto Barrón-Cedeño, and Tommaso Caselli. 2024. Language is Scary when Over-Analyzed: Unpacking Implied Misogynistic Reasoning with Argumentation Theory-Driven Prompts. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 21091–21107, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Language is Scary when Over-Analyzed: Unpacking Implied Misogynistic Reasoning with Argumentation Theory-Driven Prompts (Muti et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.1174.pdf
Data:
2024.emnlp-main.1174.data.zip