Discourse Structure Extraction from Pre-Trained and Fine-Tuned Language Models in Dialogues

Chuyuan Li, Patrick Huber, Wen Xiao, Maxime Amblard, Chloé Braud, Giuseppe Carenini


Abstract
Discourse processing suffers from data sparsity, especially for dialogues. As a result, we explore approaches to infer latent discourse structures for dialogues, based on attention matrices from Pre-trained Language Models (PLMs). We investigate multiple auxiliary tasks for fine-tuning and show that the dialogue-tailored Sentence Ordering task performs best. To locate and exploit discourse information in PLMs, we propose an unsupervised and a semi-supervised method. Our proposals achieve encouraging results on the STAC corpus, with F1 scores of 57.2 and 59.3 for the unsupervised and semi-supervised methods, respectively. When restricted to projective trees, our scores improve to 63.3 and 68.1.
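To make the attention-based idea concrete, below is a minimal, hypothetical sketch of extracting a discourse structure from PLM attention: pool token-level attention into an utterance-to-utterance matrix, then attach each utterance to the earlier utterance it attends to most. The model choice (bert-base-uncased), the last-layer/head-averaging decision, and the greedy max-attention attachment rule are illustrative assumptions, not the paper's exact method.

```python
# Minimal sketch (not the authors' released code): derive a dependency-style
# discourse structure for a dialogue from a PLM's attention matrices.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)

utterances = [
    "Hi, anyone up for a trade?",
    "Sure, what do you need?",
    "Two sheep for a brick.",
]

# Encode the whole dialogue, remembering which tokens belong to which utterance.
spans, input_ids = [], [tokenizer.cls_token_id]
for utt in utterances:
    ids = tokenizer.encode(utt, add_special_tokens=False)
    spans.append((len(input_ids), len(input_ids) + len(ids)))
    input_ids.extend(ids)
input_ids.append(tokenizer.sep_token_id)

with torch.no_grad():
    out = model(torch.tensor([input_ids]))

# Average attention over heads in the last layer: shape (seq_len, seq_len).
att = out.attentions[-1][0].mean(dim=0)

# Pool token-level attention into an utterance-to-utterance matrix.
n = len(utterances)
utt_att = torch.zeros(n, n)
for i, (si, ei) in enumerate(spans):
    for j, (sj, ej) in enumerate(spans):
        utt_att[i, j] = att[si:ei, sj:ej].mean()

# Unsupervised structure: attach each utterance to the earlier one it
# attends to most (a greedy stand-in for full tree decoding).
for i in range(1, n):
    head = int(utt_att[i, :i].argmax())
    print(f"utterance {i} -> head {head}")
```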
Anthology ID:
2023.findings-eacl.194
Volume:
Findings of the Association for Computational Linguistics: EACL 2023
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2562–2579
URL:
https://aclanthology.org/2023.findings-eacl.194
DOI:
10.18653/v1/2023.findings-eacl.194
Cite (ACL):
Chuyuan Li, Patrick Huber, Wen Xiao, Maxime Amblard, Chloé Braud, and Giuseppe Carenini. 2023. Discourse Structure Extraction from Pre-Trained and Fine-Tuned Language Models in Dialogues. In Findings of the Association for Computational Linguistics: EACL 2023, pages 2562–2579, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Discourse Structure Extraction from Pre-Trained and Fine-Tuned Language Models in Dialogues (Li et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-eacl.194.pdf
Video:
https://aclanthology.org/2023.findings-eacl.194.mp4