Revealing the Myth of Higher-Order Inference in Coreference Resolution

Liyan Xu, Jinho D. Choi


Abstract
This paper analyzes the impact of higher-order inference (HOI) on the task of coreference resolution. HOI has been adopted by almost all recent coreference resolution models without much investigation of its true effectiveness over representation learning. To make a comprehensive analysis, we implement an end-to-end coreference system as well as four HOI approaches, attended antecedent, entity equalization, span clustering, and cluster merging, where the latter two are our original methods. We find that given a high-performing encoder such as SpanBERT, the impact of HOI is negative to marginal, providing a new perspective on HOI for this task. Our best model using cluster merging achieves an Avg-F1 of 80.2 on the CoNLL 2012 shared task dataset in English.
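As background for readers unfamiliar with the first HOI variant the abstract lists, attended antecedent iteratively refines each span representation by interpolating it with the attention-weighted average of its candidate antecedents' representations. The sketch below is a simplified, hypothetical numpy rendering of one such refinement round (the gate parameter `gate_w` and all shapes are illustrative assumptions, not the paper's actual implementation):

```python
import numpy as np

def attended_antecedent_refine(spans, scores, gate_w):
    """One round of attended-antecedent refinement (illustrative sketch).

    spans:   (n, d) span representations, in document order
    scores:  (n, n) pairwise scores; scores[i, j] rates span j as an
             antecedent of span i (only j < i are valid candidates)
    gate_w:  (2d, d) hypothetical parameters of a learned sigmoid gate
    """
    n, d = spans.shape
    refined = spans.copy()
    for i in range(1, n):
        # Attention distribution over antecedent candidates j < i.
        s = scores[i, :i]
        p = np.exp(s - s.max())
        p /= p.sum()
        # Expected antecedent representation under that distribution.
        a = p @ spans[:i]
        # Gated interpolation between the current span and its
        # attended antecedent (sigmoid gate over their concatenation).
        f = 1.0 / (1.0 + np.exp(-(np.concatenate([spans[i], a]) @ gate_w)))
        refined[i] = f * spans[i] + (1.0 - f) * a
    return refined
```

In a full model this round would be repeated a fixed number of times, with antecedent scores recomputed from the refined spans; the paper's finding is that with a strong encoder this extra inference adds little or nothing.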
Anthology ID:
2020.emnlp-main.686
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
8527–8533
URL:
https://aclanthology.org/2020.emnlp-main.686
DOI:
10.18653/v1/2020.emnlp-main.686
Cite (ACL):
Liyan Xu and Jinho D. Choi. 2020. Revealing the Myth of Higher-Order Inference in Coreference Resolution. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 8527–8533, Online. Association for Computational Linguistics.
Cite (Informal):
Revealing the Myth of Higher-Order Inference in Coreference Resolution (Xu & Choi, EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.686.pdf
Video:
https://slideslive.com/38938952
Code:
lxucs/coref-hoi
Data:
CoNLL, CoNLL-2012