Contextualized Embeddings for Connective Disambiguation in Shallow Discourse Parsing

René Knaebel, Manfred Stede


Abstract
This paper presents a novel model that simplifies the disambiguation of connectives for explicit discourse relations. We use a neural approach that integrates contextualized word embeddings and predicts whether a connective candidate is part of a discourse relation or not. We study the influence of these context-specific embeddings. Further, we show the benefit of jointly training the tasks of connective disambiguation and sense classification. The success of our approach is supported by state-of-the-art results.
Anthology ID:
2020.codi-1.7
Volume:
Proceedings of the First Workshop on Computational Approaches to Discourse
Month:
November
Year:
2020
Address:
Online
Editors:
Chloé Braud, Christian Hardmeier, Junyi Jessy Li, Annie Louis, Michael Strube
Venue:
CODI
Publisher:
Association for Computational Linguistics
Pages:
65–75
URL:
https://aclanthology.org/2020.codi-1.7
DOI:
10.18653/v1/2020.codi-1.7
Bibkey:
Cite (ACL):
René Knaebel and Manfred Stede. 2020. Contextualized Embeddings for Connective Disambiguation in Shallow Discourse Parsing. In Proceedings of the First Workshop on Computational Approaches to Discourse, pages 65–75, Online. Association for Computational Linguistics.
Cite (Informal):
Contextualized Embeddings for Connective Disambiguation in Shallow Discourse Parsing (Knaebel & Stede, CODI 2020)
PDF:
https://aclanthology.org/2020.codi-1.7.pdf
Video:
https://slideslive.com/38939692