Nested Named Entity Recognition with Span-level Graphs

Juncheng Wan, Dongyu Ru, Weinan Zhang, Yong Yu


Abstract
Span-based methods with a neural network backbone have great potential for the nested named entity recognition (NER) problem. However, they can degenerate when positive and negative instances largely overlap. Moreover, generalization ability matters greatly in nested NER, since a large proportion of entities in the test set hardly appear in the training set. In this work, we improve the span representation by utilizing retrieval-based span-level graphs that connect spans and entities in the training data based on n-gram features. Specifically, we build an entity-entity graph and a span-entity graph globally based on n-gram similarity to integrate the information of similar neighbor entities into the span representation. To evaluate our method, we conduct experiments on three common nested NER datasets: ACE2004, ACE2005, and GENIA. Experimental results show that our method achieves consistent improvements on all three benchmarks (+0.30 ∼ 0.85 micro-F1) and performs particularly well on low-frequency entities (+0.56 ∼ 2.08 recall).
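As a rough illustration of the retrieval idea described in the abstract (not the paper's exact graph construction), the sketch below retrieves the training entities most similar to a candidate span using token n-gram Jaccard similarity, which is one plausible way to form span-entity edges; the function names (`token_ngrams`, `ngram_jaccard`, `retrieve_neighbors`) and the choice of bigrams are illustrative assumptions.

```python
# Hypothetical sketch: n-gram-similarity-based neighbor retrieval for a candidate
# span, using token bigrams and Jaccard similarity. This illustrates the general
# retrieval idea only; it is not the authors' implementation.
from typing import List, Set, Tuple


def token_ngrams(tokens: List[str], n: int = 2) -> Set[Tuple[str, ...]]:
    """Collect token n-grams of a span; fall back to the whole span if it is short."""
    if len(tokens) < n:
        return {tuple(tokens)}
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}


def ngram_jaccard(a: List[str], b: List[str], n: int = 2) -> float:
    """Jaccard similarity between the n-gram sets of two token sequences."""
    ga, gb = token_ngrams(a, n), token_ngrams(b, n)
    union = ga | gb
    return len(ga & gb) / len(union) if union else 0.0


def retrieve_neighbors(span: List[str],
                       train_entities: List[List[str]],
                       k: int = 5) -> List[Tuple[float, List[str]]]:
    """Return the top-k training entities most similar to the span (span-entity edges)."""
    scored = [(ngram_jaccard(span, ent), ent) for ent in train_entities]
    scored.sort(key=lambda x: x[0], reverse=True)
    return scored[:k]


if __name__ == "__main__":
    train_entities = [["the", "white", "house"],
                      ["white", "blood", "cell"],
                      ["house", "of", "representatives"]]
    # Retrieve the two nearest training entities for the span "white house".
    print(retrieve_neighbors(["white", "house"], train_entities, k=2))
```

In the paper's setting, the retrieved neighbor entities would then be aggregated (e.g., via a graph over spans and entities) to enrich the span representation before classification; the snippet above only covers the similarity-based retrieval step.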
Anthology ID:
2022.acl-long.63
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
892–903
URL:
https://aclanthology.org/2022.acl-long.63
DOI:
10.18653/v1/2022.acl-long.63
Cite (ACL):
Juncheng Wan, Dongyu Ru, Weinan Zhang, and Yong Yu. 2022. Nested Named Entity Recognition with Span-level Graphs. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 892–903, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Nested Named Entity Recognition with Span-level Graphs (Wan et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.63.pdf
Data
GENIA