On Crowdsourcing Task Design for Discourse Relation Annotation

Frances Yung, Vera Demberg


Abstract
Interpreting implicit discourse relations involves complex reasoning, requiring the integration of semantic cues with background knowledge, as overt connectives like “because” or “then” are absent. These relations often allow multiple interpretations, which are best represented as distributions. In this study, we compare two established methods for crowdsourcing implicit discourse relation annotation via connective insertion: a free-choice approach, which allows annotators to insert any suitable connective, and a forced-choice approach, which asks them to select from a set of predefined options. Specifically, we re-annotate the whole DiscoGeM 1.0 corpus, initially annotated with the free-choice method, using the forced-choice approach. The free-choice approach allows for flexible and intuitive insertion of various connectives, which are context-dependent. A comparison of over 130,000 annotations, however, shows that the free-choice strategy produces less diverse annotations, often converging on common labels. Analysis of the results reveals the interplay between task design and the annotators’ ability to interpret and produce discourse relations.
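To make the comparison described above concrete, the sketch below shows one way crowdsourced connective insertions could be aggregated into a distribution over relation senses and how two such distributions (e.g., from the free-choice and forced-choice designs) could be compared with Jensen-Shannon divergence. This is an illustrative assumption, not the authors' code or data: the CONNECTIVE_TO_SENSE mapping, the helper names, and the toy annotation lists are hypothetical simplifications of a PDTB-style connective-to-sense mapping.

# Illustrative sketch (not the paper's pipeline): aggregate inserted connectives
# into a probability distribution over discourse relation senses and compare
# the distributions produced by two annotation task designs.
from collections import Counter
from math import log2

# Hypothetical, simplified connective-to-sense mapping (for illustration only).
CONNECTIVE_TO_SENSE = {
    "because": "Cause",
    "then": "Succession",
    "however": "Contrast",
    "for example": "Instantiation",
}

def sense_distribution(connectives, senses=None):
    """Turn a list of inserted connectives into a distribution over senses."""
    counts = Counter(CONNECTIVE_TO_SENSE.get(c, "Other") for c in connectives)
    senses = senses or sorted(counts)
    total = sum(counts.values())
    return {s: counts.get(s, 0) / total for s in senses}

def js_divergence(p, q):
    """Jensen-Shannon divergence between two distributions over the same senses."""
    keys = set(p) | set(q)
    m = {k: 0.5 * (p.get(k, 0) + q.get(k, 0)) for k in keys}
    def kl(a, b):
        return sum(a[k] * log2(a[k] / b[k]) for k in keys if a.get(k, 0) > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Toy data: ten annotators labelling the same item under each task design.
free_choice   = ["because"] * 7 + ["then"] * 3
forced_choice = ["because"] * 4 + ["then"] * 3 + ["however"] * 3

senses = sorted(set(CONNECTIVE_TO_SENSE.values()))
p = sense_distribution(free_choice, senses)
q = sense_distribution(forced_choice, senses)
print(f"JS divergence between task designs: {js_divergence(p, q):.3f}")

In this toy example the free-choice annotations concentrate on a single common label, while the forced-choice annotations spread over more senses, which is the kind of diversity difference the abstract reports, though the numbers here are invented purely for illustration.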
Anthology ID:
2025.comedi-1.2
Volume:
Proceedings of Context and Meaning: Navigating Disagreements in NLP Annotation
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Michael Roth, Dominik Schlechtweg
Venues:
CoMeDi | WS
Publisher:
International Committee on Computational Linguistics
Pages:
12–19
URL:
https://aclanthology.org/2025.comedi-1.2/
Cite (ACL):
Frances Yung and Vera Demberg. 2025. On Crowdsourcing Task Design for Discourse Relation Annotation. In Proceedings of Context and Meaning: Navigating Disagreements in NLP Annotation, pages 12–19, Abu Dhabi, UAE. International Committee on Computational Linguistics.
Cite (Informal):
On Crowdsourcing Task Design for Discourse Relation Annotation (Yung & Demberg, CoMeDi 2025)
PDF:
https://aclanthology.org/2025.comedi-1.2.pdf