Are You Serious? Handling Disagreement When Annotating Conspiracy Theory Texts

Ashley Hemm, Sandra Kübler, Michelle Seelig, John Funchion, Manohar Murthi, Kamal Premaratne, Daniel Verdear, Stefan Wuchty

Abstract
We often assume that annotation tasks, such as annotating texts for the presence of conspiracy theories, can be performed with hard labels and without definitions or guidelines. Our annotation experiments, comparing students and experts, show that there is little agreement on basic annotations even among experts. We therefore conclude that disagreement needs to be accepted as an integral part of such annotations.
Anthology ID: 2024.law-1.12
Volume: Proceedings of The 18th Linguistic Annotation Workshop (LAW-XVIII)
Month: March
Year: 2024
Address: St. Julians, Malta
Editors: Sophie Henning, Manfred Stede
Venues: LAW | WS
Publisher: Association for Computational Linguistics
Pages: 124–132
URL: https://aclanthology.org/2024.law-1.12
Cite (ACL): Ashley Hemm, Sandra Kübler, Michelle Seelig, John Funchion, Manohar Murthi, Kamal Premaratne, Daniel Verdear, and Stefan Wuchty. 2024. Are You Serious? Handling Disagreement When Annotating Conspiracy Theory Texts. In Proceedings of The 18th Linguistic Annotation Workshop (LAW-XVIII), pages 124–132, St. Julians, Malta. Association for Computational Linguistics.
Cite (Informal): Are You Serious? Handling Disagreement When Annotating Conspiracy Theory Texts (Hemm et al., LAW-WS 2024)
PDF: https://aclanthology.org/2024.law-1.12.pdf