Online Coreference Resolution for Dialogue Processing: Improving Mention-Linking on Real-Time Conversations

Liyan Xu, Jinho D. Choi


Abstract
This paper proposes a direction for coreference resolution with online decoding on actively generated input such as dialogue: at each dialogue turn, the model accepts an utterance and its past context, then finds mentions in the current utterance as well as their referents. A baseline and four incrementally updated models adapted from the mention-linking paradigm are proposed for this new setting, addressing different aspects including singletons, speaker-grounded encoding, and cross-turn mention contextualization. Our approach is assessed on three datasets: Friends, OntoNotes, and BOLT. Results show that each aspect brings steady improvement, and our best models outperform the baseline by over 10%, presenting an effective system for this setting. Further analysis highlights characteristics of the task, such as the importance of addressing mention recall.
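The abstract's turn-by-turn decoding loop can be sketched as a minimal interface: at each turn the system receives one utterance plus its accumulated context and links the turn's mentions to existing clusters or opens new (possibly singleton) ones. The class and method names below, and the naive string-match linker, are illustrative assumptions, not the paper's actual model.

```python
from dataclasses import dataclass, field

@dataclass
class OnlineCoref:
    """Toy online coreference resolver: state persists across turns."""
    clusters: list = field(default_factory=list)  # each cluster is a list of mentions
    history: list = field(default_factory=list)   # past (speaker, utterance) context

    def resolve_turn(self, speaker, utterance, mentions):
        """Link each mention in the current utterance to a prior cluster,
        or start a new (possibly singleton) cluster; return mention -> cluster id."""
        links = {}
        for m in mentions:
            for i, cluster in enumerate(self.clusters):
                # Hypothetical linker: case-insensitive string match stands in
                # for the learned mention-linking scorer described in the paper.
                if m.lower() in (x.lower() for x in cluster):
                    cluster.append(m)
                    links[m] = i
                    break
            else:
                self.clusters.append([m])
                links[m] = len(self.clusters) - 1
        self.history.append((speaker, utterance))
        return links

coref = OnlineCoref()
coref.resolve_turn("Ross", "Rachel called me.", ["Rachel", "me"])
print(coref.resolve_turn("Joey", "Rachel is here.", ["Rachel"]))  # "Rachel" links back to its earlier cluster
```

The key property this sketch shows is the online contract: state (clusters, context) is carried across calls, so each turn is decoded against everything seen so far rather than re-processing the full document.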
Anthology ID:
2022.starsem-1.30
Volume:
Proceedings of the 11th Joint Conference on Lexical and Computational Semantics
Month:
July
Year:
2022
Address:
Seattle, Washington
Editors:
Vivi Nastase, Ellie Pavlick, Mohammad Taher Pilehvar, Jose Camacho-Collados, Alessandro Raganato
Venue:
*SEM
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
341–347
URL:
https://aclanthology.org/2022.starsem-1.30
DOI:
10.18653/v1/2022.starsem-1.30
Cite (ACL):
Liyan Xu and Jinho D. Choi. 2022. Online Coreference Resolution for Dialogue Processing: Improving Mention-Linking on Real-Time Conversations. In Proceedings of the 11th Joint Conference on Lexical and Computational Semantics, pages 341–347, Seattle, Washington. Association for Computational Linguistics.
Cite (Informal):
Online Coreference Resolution for Dialogue Processing: Improving Mention-Linking on Real-Time Conversations (Xu & Choi, *SEM 2022)
PDF:
https://aclanthology.org/2022.starsem-1.30.pdf