Harmful Language Datasets: An Assessment of Robustness

Katerina Korre, John Pavlopoulos, Jeffrey Sorensen, Léo Laugier, Ion Androutsopoulos, Lucas Dixon, Alberto Barrón-Cedeño


Abstract
The automated detection of harmful language is of great importance to the online world, especially given the growth of social media and, consequently, of polarisation. There are many open challenges to high-quality detection of harmful text, from dataset creation to generalisable application, thus calling for more systematic studies. In this paper, we explore re-annotation as a means of examining the robustness of already existing labelled datasets, showing that, despite the use of alternative definitions, inter-annotator agreement remains very inconsistent, highlighting the intrinsically subjective and variable nature of the task. In addition, we build automatic toxicity detectors using the existing datasets, with their original labels, and evaluate them on our multi-definition and multi-source datasets. Surprisingly, while other studies show that hate speech detection models perform better on data drawn from the same distribution as the training set, our analysis demonstrates that this is not necessarily the case.
Anthology ID:
2023.woah-1.24
Volume:
The 7th Workshop on Online Abuse and Harms (WOAH)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Yi-ling Chung, Paul Röttger, Debora Nozza, Zeerak Talat, Aida Mostafazadeh Davani
Venue:
WOAH
Publisher:
Association for Computational Linguistics
Pages:
221–230
URL:
https://aclanthology.org/2023.woah-1.24
DOI:
10.18653/v1/2023.woah-1.24
Cite (ACL):
Katerina Korre, John Pavlopoulos, Jeffrey Sorensen, Léo Laugier, Ion Androutsopoulos, Lucas Dixon, and Alberto Barrón-Cedeño. 2023. Harmful Language Datasets: An Assessment of Robustness. In The 7th Workshop on Online Abuse and Harms (WOAH), pages 221–230, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Harmful Language Datasets: An Assessment of Robustness (Korre et al., WOAH 2023)
PDF:
https://aclanthology.org/2023.woah-1.24.pdf