ScdNER: Span-Based Consistency-Aware Document-Level Named Entity Recognition

Ying Wei, Qi Li


Abstract
Document-level NER approaches use global information via word-based key-value memories to make accurate and consistent predictions. However, word-level global information can introduce noise when the same word appears in different token sequences with different labels. This work proposes a two-stage document-level NER model, ScdNER, that makes more accurate and consistent predictions via adaptive span-level global feature fusion. In the first stage, ScdNER trains a binary classifier to predict the probability that a token sequence is an entity. Through a span-based key-value memory, these probabilities are used to compute the span's global features while reducing the impact of non-entity sequences. The second stage predicts entity types using a gate mechanism that balances each span's local and global information, yielding adaptive global feature fusion. Experiments on benchmark datasets from the scientific, biomedical, and general domains show the effectiveness of the proposed method.
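The second-stage gate mechanism described above can be sketched as a simple learned interpolation between a span's local and global features. The sketch below is a minimal illustration, not the paper's implementation: the function name `gated_fusion`, the single scalar gate, and the hand-set weights are all assumptions for clarity (the paper's gate is learned end-to-end, typically vector-valued).

```python
import math

def sigmoid(x):
    """Standard logistic function."""
    return 1.0 / (1.0 + math.exp(-x))

def gated_fusion(local_feat, global_feat, w_local, w_global, bias):
    """Blend a span's local and global features with a scalar gate.

    A hypothetical single-unit gate: a linear projection of both feature
    vectors is squashed by a sigmoid, then used to interpolate
    element-wise between the local and global representations.
    """
    z = sum(w * x for w, x in zip(w_local, local_feat)) \
        + sum(w * x for w, x in zip(w_global, global_feat)) + bias
    g = sigmoid(z)  # g -> 1 favors local features, g -> 0 favors global
    return [g * l + (1.0 - g) * gl for l, gl in zip(local_feat, global_feat)]

# With zero weights and bias, the gate is 0.5 and the fused vector is
# the element-wise mean of the two feature vectors.
fused = gated_fusion([1.0, 2.0], [3.0, 4.0], [0.0, 0.0], [0.0, 0.0], 0.0)
```

In the model itself, the gate lets each span decide adaptively how much document-level (global) evidence to trust against its sentence-level (local) context.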
Anthology ID:
2023.emnlp-main.970
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
15677–15685
URL:
https://aclanthology.org/2023.emnlp-main.970
DOI:
10.18653/v1/2023.emnlp-main.970
Cite (ACL):
Ying Wei and Qi Li. 2023. ScdNER: Span-Based Consistency-Aware Document-Level Named Entity Recognition. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 15677–15685, Singapore. Association for Computational Linguistics.
Cite (Informal):
ScdNER: Span-Based Consistency-Aware Document-Level Named Entity Recognition (Wei & Li, EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.970.pdf
Video:
https://aclanthology.org/2023.emnlp-main.970.mp4