Graph-combined Coreference Resolution Methods on Conversational Machine Reading Comprehension with Pre-trained Language Model

Zhaodong Wang, Kazunori Komatani


Abstract
Coreference resolution, such as for anaphora, is an essential challenge commonly encountered in conversational machine reading comprehension (CMRC). The task is to determine the referential entity to which a pronoun refers on the basis of contextual information. Existing approaches based on pre-trained language models (PLMs) mainly rely on an end-to-end method, which still has limitations in clarifying referential dependency. In this study, we propose a novel graph-based approach that integrates the coreference of a given text into graph structures (called coreference graphs), which can pinpoint a pronoun's referential entity. We propose two graph-combined methods for CMRC, evidence-enhanced and the fusion model, which integrate coreference graphs at different levels of the PLM architecture. The evidence-enhanced methods operate at the textual level and use an evidence generator (which generates new text elaborating a pronoun) and an enhanced question (which rewrites the pronoun in a question) as PLM input. The fusion model is a structural-level method that combines the PLM with a graph neural network. We evaluated these approaches on a pronoun-containing subset of CoQA and on the whole CoQA dataset. The results show that our methods outperform baseline PLM methods with BERT and RoBERTa.
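The abstract describes combining a PLM with a graph neural network over coreference graphs. As a rough illustration only, and not the authors' fusion model, the sketch below shows one way a coreference adjacency matrix over tokens could be propagated with a single GCN-style step and then fused with BERT token embeddings; the class name GraphFusionSketch, the concat-and-project fusion, and the self-loop placeholder adjacency are assumptions made for this sketch.

# Minimal sketch (assumptions noted above), not the paper's implementation.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class GraphFusionSketch(nn.Module):
    def __init__(self, hidden=768):
        super().__init__()
        self.plm = BertModel.from_pretrained("bert-base-uncased")
        self.gnn_proj = nn.Linear(hidden, hidden)   # one simple GCN-style layer
        self.fuse = nn.Linear(2 * hidden, hidden)   # concat-and-project fusion

    def forward(self, input_ids, attention_mask, coref_adj):
        # coref_adj: (batch, seq, seq) adjacency built from coreference links,
        # e.g. an edge between a pronoun token and its referent entity tokens.
        h = self.plm(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        deg = coref_adj.sum(-1, keepdim=True).clamp(min=1)   # simple degree normalization
        g = torch.relu(self.gnn_proj(coref_adj @ h / deg))   # propagate over coreference edges
        return self.fuse(torch.cat([h, g], dim=-1))          # graph-aware token representations

tok = BertTokenizer.from_pretrained("bert-base-uncased")
enc = tok("Alice lost her keys. She found them later.", return_tensors="pt")
n = enc["input_ids"].size(1)
adj = torch.eye(n).unsqueeze(0)   # placeholder: self-loops only; real edges would come
                                  # from a coreference resolver
model = GraphFusionSketch()
out = model(enc["input_ids"], enc["attention_mask"], adj)   # shape (1, n, 768)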
Anthology ID:
2022.dialdoc-1.8
Volume:
Proceedings of the Second DialDoc Workshop on Document-grounded Dialogue and Conversational Question Answering
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Song Feng, Hui Wan, Caixia Yuan, Han Yu
Venue:
dialdoc
Publisher:
Association for Computational Linguistics
Pages:
72–82
URL:
https://aclanthology.org/2022.dialdoc-1.8
DOI:
10.18653/v1/2022.dialdoc-1.8
Cite (ACL):
Zhaodong Wang and Kazunori Komatani. 2022. Graph-combined Coreference Resolution Methods on Conversational Machine Reading Comprehension with Pre-trained Language Model. In Proceedings of the Second DialDoc Workshop on Document-grounded Dialogue and Conversational Question Answering, pages 72–82, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Graph-combined Coreference Resolution Methods on Conversational Machine Reading Comprehension with Pre-trained Language Model (Wang & Komatani, dialdoc 2022)
PDF:
https://aclanthology.org/2022.dialdoc-1.8.pdf
Video:
https://aclanthology.org/2022.dialdoc-1.8.mp4
Data:
CANARD, CoQA