An Embarrassingly Easy but Strong Baseline for Nested Named Entity Recognition

Hang Yan, Yu Sun, Xiaonan Li, Xipeng Qiu


Abstract
Named entity recognition (NER) is the task of detecting and classifying entity spans in text. When entity spans overlap with each other, the task is called nested NER. Span-based methods have been widely used to tackle nested NER. Most of these methods produce a score matrix in which each entry corresponds to a span. However, previous work ignores the spatial relations in the score matrix. In this paper, we propose using a Convolutional Neural Network (CNN) to model these spatial relations. Despite its simplicity, experiments on three commonly used nested NER datasets show that our model surpasses several recently proposed methods that use the same pre-trained encoders. Further analysis shows that using a CNN helps the model find more nested entities. In addition, we find that different papers use different sentence tokenizations for the three nested NER datasets, which affects the comparison. We therefore release a pre-processing script to facilitate future comparison.
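To make the idea concrete, below is a minimal sketch (not the authors' released implementation) of applying a 2D CNN over a span score/feature matrix, where entry (i, j) represents the span from token i to token j. The biaffine-style head/tail projections, the hidden sizes, and the class names are illustrative assumptions.

```python
# Hypothetical sketch: a CNN over the span matrix of a span-based nested NER model.
# Shapes and the additive head/tail span representation are assumptions, not the
# paper's exact architecture.
import torch
import torch.nn as nn


class SpanCNNScorer(nn.Module):
    def __init__(self, hidden_size: int, num_labels: int, cnn_dim: int = 200):
        super().__init__()
        # Project each token into "span start" and "span end" representations.
        self.head_proj = nn.Linear(hidden_size, cnn_dim)
        self.tail_proj = nn.Linear(hidden_size, cnn_dim)
        # A 3x3 convolution over the (start, end) grid lets each span interact
        # with its neighbouring spans, i.e. the spatial relations in the matrix.
        self.conv = nn.Conv2d(cnn_dim, cnn_dim, kernel_size=3, padding=1)
        self.out = nn.Linear(cnn_dim, num_labels)

    def forward(self, token_states: torch.Tensor) -> torch.Tensor:
        # token_states: (batch, seq_len, hidden_size) from a pre-trained encoder
        head = self.head_proj(token_states)            # (B, L, D)
        tail = self.tail_proj(token_states)            # (B, L, D)
        # Span feature matrix: entry (i, j) represents the span from token i to j.
        span = head.unsqueeze(2) + tail.unsqueeze(1)   # (B, L, L, D)
        grid = span.permute(0, 3, 1, 2)                # (B, D, L, L) for Conv2d
        grid = torch.relu(self.conv(grid))             # model spatial relations
        logits = self.out(grid.permute(0, 2, 3, 1))    # (B, L, L, num_labels)
        return logits


# Usage sketch with dummy encoder outputs.
scorer = SpanCNNScorer(hidden_size=768, num_labels=5)
dummy = torch.randn(2, 20, 768)
print(scorer(dummy).shape)  # torch.Size([2, 20, 20, 5])
```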
Anthology ID:
2023.acl-short.123
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1442–1452
URL:
https://aclanthology.org/2023.acl-short.123
DOI:
10.18653/v1/2023.acl-short.123
Cite (ACL):
Hang Yan, Yu Sun, Xiaonan Li, and Xipeng Qiu. 2023. An Embarrassingly Easy but Strong Baseline for Nested Named Entity Recognition. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 1442–1452, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
An Embarrassingly Easy but Strong Baseline for Nested Named Entity Recognition (Yan et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-short.123.pdf
Video:
https://aclanthology.org/2023.acl-short.123.mp4