A simple but effective model for attachment in discourse parsing with multi-task learning for relation labeling

Zineb Bennis, Julie Hunter, Nicholas Asher


Abstract
In this paper, we present a discourse parsing model for conversation trained on the STAC corpus. We fine-tune a BERT-based model to encode pairs of discourse units and use a simple linear layer to predict discourse attachments. We then exploit a multi-task setting to predict relation labels. The multi-task approach effectively aids in the difficult task of relation type prediction; our F1 score of 57 surpasses the state of the art with no loss in performance for attachment, confirming the intuitive interdependence of these two tasks. Our method also improves over previous discourse parsing models in allowing longer input sizes and in permitting attachments in which one node has multiple parents, an important feature of multiparty conversation.
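The architecture described above, a shared pair encoding feeding two linear heads, one for attachment and one for relation labels, can be sketched as follows. This is a minimal illustrative sketch, not the authors' code: the BERT encoder is stubbed with random vectors, and the hidden size and relation inventory size are assumed for illustration.

```python
import numpy as np

# Assumed dimensions for illustration; the paper fine-tunes a BERT-based
# encoder (hidden size 768 in standard BERT) over the STAC relation set.
HIDDEN = 8        # stand-in for the encoder's hidden size
N_RELATIONS = 16  # hypothetical size of the relation label inventory

rng = np.random.default_rng(0)

def encode_pair(du_i: str, du_j: str) -> np.ndarray:
    """Stub for the fine-tuned BERT pair encoder.

    In the real model this would encode the two discourse units jointly
    (e.g. as a single sequence) and return one vector for the pair; here
    we return a random vector so the sketch runs without model weights.
    """
    return rng.standard_normal(HIDDEN)

# Two linear heads sharing the same pair encoding (the multi-task setting):
W_attach = rng.standard_normal((HIDDEN, 2))          # attach vs. no-attach
W_rel = rng.standard_normal((HIDDEN, N_RELATIONS))   # relation label scores

def softmax(z: np.ndarray) -> np.ndarray:
    e = np.exp(z - z.max())
    return e / e.sum()

# Score one candidate DU pair (example utterances are made up).
h = encode_pair("Anyone want to trade sheep?", "I'll give you wheat.")
p_attach = softmax(h @ W_attach)  # P(attached) for this pair
p_rel = softmax(h @ W_rel)        # relation distribution, relevant if attached
```

Because each candidate pair is scored independently, nothing prevents a discourse unit from being attached to several parents, which is the multi-parent property the abstract highlights for multiparty conversation.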
Anthology ID:
2023.eacl-main.247
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
3412–3417
URL:
https://aclanthology.org/2023.eacl-main.247
DOI:
10.18653/v1/2023.eacl-main.247
Cite (ACL):
Zineb Bennis, Julie Hunter, and Nicholas Asher. 2023. A simple but effective model for attachment in discourse parsing with multi-task learning for relation labeling. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 3412–3417, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
A simple but effective model for attachment in discourse parsing with multi-task learning for relation labeling (Bennis et al., EACL 2023)
PDF:
https://aclanthology.org/2023.eacl-main.247.pdf
Dataset:
 2023.eacl-main.247.dataset.zip
Video:
 https://aclanthology.org/2023.eacl-main.247.mp4