A Topicality-Driven QUD Model for Discourse Processing

Yingxue Fu, Mark-Jan Nederhof, Anais Ollagnier


Abstract
Question Under Discussion (QUD) is a discourse framework that has attracted growing interest in NLP in recent years. Among existing QUD models, the QUD tree approach (Riester, 2019) focuses on reconstructing QUDs and their hierarchical relationships, using a single tree to represent discourse structure. A prior implementation achieved only moderate inter-annotator agreement, highlighting the challenging nature of the task. In this paper, we propose a new QUD model for annotating hierarchical discourse structure. Our annotation achieves high inter-annotator agreement: 81.45% on short files and 79.53% on long files of Wall Street Journal articles. We present preliminary results on automatic annotation with GPT-4, which suggest that even one of the best-performing LLMs still struggles to capture hierarchical discourse structure. Moreover, we compare our annotations with RST annotations. Lastly, we present an approach for integrating hierarchical and local discourse relation annotations within the proposed model.
Anthology ID:
2025.sigdial-1.17
Volume:
Proceedings of the 26th Annual Meeting of the Special Interest Group on Discourse and Dialogue
Month:
August
Year:
2025
Address:
Avignon, France
Editors:
Frédéric Béchet, Fabrice Lefèvre, Nicholas Asher, Seokhwan Kim, Teva Merlin
Venue:
SIGDIAL
Publisher:
Association for Computational Linguistics
Pages:
214–230
URL:
https://aclanthology.org/2025.sigdial-1.17/
Cite (ACL):
Yingxue Fu, Mark-Jan Nederhof, and Anais Ollagnier. 2025. A Topicality-Driven QUD Model for Discourse Processing. In Proceedings of the 26th Annual Meeting of the Special Interest Group on Discourse and Dialogue, pages 214–230, Avignon, France. Association for Computational Linguistics.
Cite (Informal):
A Topicality-Driven QUD Model for Discourse Processing (Fu et al., SIGDIAL 2025)
PDF:
https://aclanthology.org/2025.sigdial-1.17.pdf