The Role of Context and Uncertainty in Shallow Discourse Parsing

Katherine Atwell, Remi Choi, Junyi Jessy Li, Malihe Alikhani


Abstract
Discourse parsing has proven to be useful for a number of NLP tasks that require complex reasoning. However, over a decade since the advent of the Penn Discourse Treebank, predicting implicit discourse relations in text remains challenging. There are several possible reasons for this, and we hypothesize that models should be exposed to more context, as it plays an important role in accurate human annotation; meanwhile, adding uncertainty measures can improve model accuracy and calibration. To thoroughly investigate this phenomenon, we perform a series of experiments to determine 1) the effects of context on human judgments, and 2) the effect of quantifying uncertainty with annotator confidence ratings on model accuracy and calibration (which we measure using the Brier score (Brier, 1950)). We find that including annotator accuracy and confidence improves model accuracy, and that incorporating confidence in the model’s temperature function can lead to models with significantly better-calibrated confidence measures. We also report insightful qualitative findings regarding human and model behavior on these datasets.
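The Brier score mentioned above measures calibration as the mean squared difference between a model's predicted class probabilities and the one-hot true labels (lower is better; 0 is perfect). A minimal sketch of the multi-class formulation, assuming NumPy arrays of per-example probability distributions (the function name and signature here are illustrative, not from the paper):

```python
import numpy as np

def brier_score(probs, labels):
    """Multi-class Brier score: mean over examples of the squared
    Euclidean distance between the predicted probability vector
    and the one-hot encoding of the true label."""
    probs = np.asarray(probs, dtype=float)
    labels = np.asarray(labels, dtype=int)
    onehot = np.zeros_like(probs)
    onehot[np.arange(len(labels)), labels] = 1.0
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))
```

For example, a perfectly confident, correct classifier scores 0.0, while a two-class model that always predicts 50/50 scores 0.5 regardless of the labels.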
Anthology ID:
2022.coling-1.67
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
797–811
URL:
https://aclanthology.org/2022.coling-1.67
Cite (ACL):
Katherine Atwell, Remi Choi, Junyi Jessy Li, and Malihe Alikhani. 2022. The Role of Context and Uncertainty in Shallow Discourse Parsing. In Proceedings of the 29th International Conference on Computational Linguistics, pages 797–811, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
The Role of Context and Uncertainty in Shallow Discourse Parsing (Atwell et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.67.pdf