Dialogue Discourse Parsing as Generation: A Sequence-to-Sequence LLM-based Approach

Chuyuan Li, Yuwei Yin, Giuseppe Carenini


Abstract
Existing work on dialogue discourse parsing mostly relies on encoder-only models and sophisticated decoding strategies to extract structures. Despite recent advances in Large Language Models (LLMs), there has been little work directly applying these models to discourse parsing. To fully exploit the rich semantic and discourse knowledge in LLMs, we explore the feasibility of transforming discourse parsing into a generation task using a text-to-text paradigm. Our approach is intuitive and requires no modification of the LLM architecture. Experimental results on the STAC and Molweni datasets show that a sequence-to-sequence model such as T0 can perform reasonably well. Notably, our improved transition-based sequence-to-sequence system achieves new state-of-the-art performance on Molweni, demonstrating the effectiveness of the proposed method. Furthermore, our systems can generate richer discourse structures such as directed acyclic graphs, whereas previous methods are limited to trees.
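To make the text-to-text formulation concrete, below is a minimal sketch using the Hugging Face transformers API with a T0 checkpoint (the abstract names T0 as an example model). The prompt wording, the dialogue serialization into numbered utterances, and the edge-list target format are illustrative assumptions on our part; the paper itself defines the exact input and output formats.

```python
# Minimal sketch: dialogue discourse parsing cast as sequence-to-sequence
# generation. The serialization below is an assumption for illustration,
# not the paper's exact format.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "bigscience/T0_3B"  # a public T0 checkpoint, used as an example

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

# The dialogue is serialized as numbered elementary discourse units (EDUs).
dialogue = (
    "Parse the discourse structure of this dialogue:\n"
    "1: A: anyone want to trade sheep?\n"
    "2: B: I have wheat to offer.\n"
    "3: A: great, one sheep for one wheat?"
)

inputs = tokenizer(dialogue, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
prediction = tokenizer.decode(output_ids[0], skip_special_tokens=True)

# A plausible target string would list (dependent <- head, relation) links,
# e.g. "2 <- 1 (Question-Answer-Pair); 3 <- 2 (Acknowledgement)".
print(prediction)
```

Note that because the target is just a generated string of links, nothing in this formulation restricts each EDU to a single head, which is consistent with the abstract's claim that such systems can emit directed acyclic graphs rather than only trees.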
Anthology ID:
2024.sigdial-1.1
Volume:
Proceedings of the 25th Annual Meeting of the Special Interest Group on Discourse and Dialogue
Month:
September
Year:
2024
Address:
Kyoto, Japan
Editors:
Tatsuya Kawahara, Vera Demberg, Stefan Ultes, Koji Inoue, Shikib Mehri, David Howcroft, Kazunori Komatani
Venue:
SIGDIAL
SIG:
SIGDIAL
Publisher:
Association for Computational Linguistics
Pages:
1–14
URL:
https://aclanthology.org/2024.sigdial-1.1
DOI:
10.18653/v1/2024.sigdial-1.1
Cite (ACL):
Chuyuan Li, Yuwei Yin, and Giuseppe Carenini. 2024. Dialogue Discourse Parsing as Generation: A Sequence-to-Sequence LLM-based Approach. In Proceedings of the 25th Annual Meeting of the Special Interest Group on Discourse and Dialogue, pages 1–14, Kyoto, Japan. Association for Computational Linguistics.
Cite (Informal):
Dialogue Discourse Parsing as Generation: A Sequence-to-Sequence LLM-based Approach (Li et al., SIGDIAL 2024)
PDF:
https://aclanthology.org/2024.sigdial-1.1.pdf