Feature-augmented model for multilingual discourse relation classification

Eleni Metheniti, Chloé Braud, Philippe Muller


Abstract
Discourse relation classification in a multilingual, cross-framework setting is a challenging task, and the best-performing systems so far have relied on monolingual and mono-framework approaches. In this paper, we introduce transformer-based multilingual models trained jointly over all datasets, thus covering different languages and discourse frameworks. We demonstrate their ability to outperform single-corpus models and to overcome, to some extent, the disparity among corpora by relying on linguistic features and generic information about the nature of the datasets. We also compare the performance of different multilingual pretrained models, as well as different encodings of the relation direction, a key component of the task. Our results on the 16 datasets of the DISRPT 2021 benchmark show improvements in accuracy on (almost) all datasets compared to the monolingual models, with a best average accuracy of 65.91%, a 4% improvement over the state of the art.
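
The feature-augmented setup described in the abstract can be illustrated with a minimal sketch (not the authors' released code): a multilingual pretrained encoder produces a representation of the two discourse arguments, which is concatenated with a vector of linguistic and dataset features before a linear classification layer. The backbone name, feature dimensionality, and label count below are illustrative assumptions, not values from the paper.

import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class FeatureAugmentedClassifier(nn.Module):
    """Sketch of a relation classifier that appends hand-crafted features
    to a multilingual encoder's pooled representation."""

    def __init__(self, model_name="xlm-roberta-base", n_features=10, n_labels=17):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Classifier input = encoder representation + extra feature vector.
        self.classifier = nn.Linear(hidden + n_features, n_labels)

    def forward(self, input_ids, attention_mask, features):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        # Use the first token's hidden state as the argument-pair representation.
        pooled = out.last_hidden_state[:, 0]
        # Concatenate linguistic/dataset features and classify.
        return self.classifier(torch.cat([pooled, features], dim=-1))

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = FeatureAugmentedClassifier()
# Encode the two discourse arguments as a sentence pair.
enc = tokenizer("It was raining.", "We stayed home.", return_tensors="pt")
feats = torch.zeros(1, 10)  # placeholder feature vector (hypothetical size)
logits = model(enc["input_ids"], enc["attention_mask"], feats)

A design point this sketch reflects: because the extra features enter only at the classification layer, the same encoder can be trained jointly across all corpora while dataset-specific information is still available to the classifier.
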
Anthology ID:
2024.codi-1.9
Volume:
Proceedings of the 5th Workshop on Computational Approaches to Discourse (CODI 2024)
Month:
March
Year:
2024
Address:
St. Julians, Malta
Editors:
Michael Strube, Chloé Braud, Christian Hardmeier, Junyi Jessy Li, Sharid Loáiciga, Amir Zeldes, Chuyuan Li
Venues:
CODI | WS
Publisher:
Association for Computational Linguistics
Pages:
91–104
URL:
https://aclanthology.org/2024.codi-1.9
Cite (ACL):
Eleni Metheniti, Chloé Braud, and Philippe Muller. 2024. Feature-augmented model for multilingual discourse relation classification. In Proceedings of the 5th Workshop on Computational Approaches to Discourse (CODI 2024), pages 91–104, St. Julians, Malta. Association for Computational Linguistics.
Cite (Informal):
Feature-augmented model for multilingual discourse relation classification (Metheniti et al., CODI-WS 2024)
PDF:
https://aclanthology.org/2024.codi-1.9.pdf
Supplementary material:
2024.codi-1.9.SupplementaryMaterial.zip