Nested Named Entity Recognition as Latent Lexicalized Constituency Parsing

Chao Lou, Songlin Yang, Kewei Tu


Abstract
Nested named entity recognition (NER) has been receiving increasing attention. Recently, Fu et al. (2020) adapted a span-based constituency parser to tackle nested NER. They treat nested entities as partially-observed constituency trees and propose the masked inside algorithm for partial marginalization. However, their method cannot leverage entity heads, which have been shown to be useful in entity mention detection and entity typing. In this work, we resort to more expressive structures: lexicalized constituency trees, in which constituents are annotated with headwords, to model nested entities. We leverage the Eisner-Satta algorithm to perform partial marginalization and inference efficiently. In addition, we propose to use (1) a two-stage strategy, (2) a head regularization loss, and (3) a head-aware labeling loss to enhance performance. We conduct a thorough ablation study to investigate the contribution of each component. Experimentally, our method achieves state-of-the-art performance on ACE2004, ACE2005, and NNE, and competitive performance on GENIA, while maintaining fast inference speed.
Anthology ID:
2022.acl-long.428
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
6183–6198
URL:
https://aclanthology.org/2022.acl-long.428
DOI:
10.18653/v1/2022.acl-long.428
Cite (ACL):
Chao Lou, Songlin Yang, and Kewei Tu. 2022. Nested Named Entity Recognition as Latent Lexicalized Constituency Parsing. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 6183–6198, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Nested Named Entity Recognition as Latent Lexicalized Constituency Parsing (Lou et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.428.pdf
Code:
louchao98/nner_as_parsing
Data:
NNE