Improving Named Entity Recognition with Attentive Ensemble of Syntactic Information

Yuyang Nie, Yuanhe Tian, Yan Song, Xiang Ao, Xiang Wan


Abstract
Named entity recognition (NER) is highly sensitive to sentential syntactic and semantic properties, where entities may be extracted according to how they are used and placed in the running text. To model such properties, one could rely on existing resources to provide helpful knowledge to the NER task; several existing studies have proved the effectiveness of doing so, yet they are limited in how they leverage such knowledge, e.g., in distinguishing the information that matters for a particular context. In this paper, we improve NER by leveraging different types of syntactic information through an attentive ensemble, which is realized by the proposed key-value memory networks, syntax attention, and gate mechanism for encoding, weighting, and aggregating such syntactic information, respectively. Experimental results on six English and Chinese benchmark datasets demonstrate the effectiveness of the proposed model, which outperforms previous studies on all experiment datasets.
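To make the three components concrete, the following is a minimal sketch, assuming PyTorch, of how a key-value memory (encoding one syntax type), a syntax attention (weighting several syntax types), and a gate (aggregating them with the contextual token representation) could fit together. All module and variable names are hypothetical illustrations and do not mirror the released AESINER code.

```python
# Illustrative sketch only (not the authors' implementation) of the three
# components named in the abstract: key-value memory, syntax attention, gate.
import torch
import torch.nn as nn

class KeyValueMemory(nn.Module):
    """Encode one type of syntactic information (keys/values indexed by syntax labels)."""
    def __init__(self, num_labels, dim):
        super().__init__()
        self.keys = nn.Embedding(num_labels, dim)
        self.values = nn.Embedding(num_labels, dim)

    def forward(self, h, label_ids):
        # h: (batch, seq, dim) contextual token vectors
        # label_ids: (batch, seq, k) syntax labels associated with each token
        k = self.keys(label_ids)                        # (batch, seq, k, dim)
        v = self.values(label_ids)                      # (batch, seq, k, dim)
        scores = torch.einsum('bsd,bskd->bsk', h, k)    # token-to-key relevance
        probs = torch.softmax(scores, dim=-1)
        return torch.einsum('bsk,bskd->bsd', probs, v)  # weighted value readout

class AttentiveEnsemble(nn.Module):
    """Weight several syntax representations, then gate them into the token vector."""
    def __init__(self, dim, num_syntax_types):
        super().__init__()
        self.type_query = nn.Linear(dim, num_syntax_types)
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, h, syntax_reprs):
        # syntax_reprs: list of (batch, seq, dim), one per syntax type
        # (e.g., POS tags, chunks, dependencies)
        s = torch.stack(syntax_reprs, dim=2)             # (batch, seq, T, dim)
        w = torch.softmax(self.type_query(h), dim=-1)    # (batch, seq, T) syntax attention
        agg = torch.einsum('bst,bstd->bsd', w, s)        # weighted aggregation
        g = torch.sigmoid(self.gate(torch.cat([h, agg], dim=-1)))
        return g * h + (1 - g) * agg                     # gated fusion fed to the tagger
```

In this sketch, each syntax type gets its own KeyValueMemory, the resulting vectors are combined by AttentiveEnsemble, and the fused representation would be passed to a sequence-labeling decoder (e.g., a CRF layer) for entity tagging.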
Anthology ID:
2020.findings-emnlp.378
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4231–4245
URL:
https://aclanthology.org/2020.findings-emnlp.378
DOI:
10.18653/v1/2020.findings-emnlp.378
Cite (ACL):
Yuyang Nie, Yuanhe Tian, Yan Song, Xiang Ao, and Xiang Wan. 2020. Improving Named Entity Recognition with Attentive Ensemble of Syntactic Information. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 4231–4245, Online. Association for Computational Linguistics.
Cite (Informal):
Improving Named Entity Recognition with Attentive Ensemble of Syntactic Information (Nie et al., Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.378.pdf
Code
 cuhksz-nlp/AESINER
Data
OntoNotes 4.0, OntoNotes 5.0, Resume NER, WNUT 2016 NER, WNUT 2017, Weibo NER