Neural Language Modeling for Named Entity Recognition

Zhihong Lei, Weiyue Wang, Christian Dugast, Hermann Ney


Abstract
Named entity recognition is a key component in various natural language processing systems, and neural architectures provide significant improvements over conventional approaches. Regardless of the word embedding and hidden layer structures of these networks, a conditional random field layer is commonly used for the output. This work proposes a neural language model as an alternative to the conditional random field layer, which is more flexible with respect to corpus size. Experimental results show that the proposed system has a significant advantage in training speed, at the cost of only a marginal performance degradation.
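
The page gives no implementation details beyond the abstract. As a rough, hypothetical sketch (not the paper's actual architecture) of replacing a CRF output layer with a locally normalized, language-model-style tag predictor, the snippet below scores each tag from a BiLSTM encoder state plus an embedding of the previous gold tag; the encoder choice, dimensions, and teacher-forced conditioning are all assumptions.

```python
# Hypothetical sketch: an NER tagger whose output layer predicts each tag
# from the encoder state and the previous tag, language-model style,
# instead of scoring whole tag sequences with a CRF. Not the paper's code.
import torch
import torch.nn as nn

class LMTagDecoder(nn.Module):
    def __init__(self, vocab_size, num_tags, emb_dim=100, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim // 2,
                               batch_first=True, bidirectional=True)
        self.tag_embed = nn.Embedding(num_tags + 1, 32)  # extra index = <bos> tag
        self.out = nn.Linear(hidden_dim + 32, num_tags)
        self.num_tags = num_tags

    def forward(self, tokens, gold_tags):
        hidden, _ = self.encoder(self.embed(tokens))        # (B, T, hidden_dim)
        bos = torch.full_like(gold_tags[:, :1], self.num_tags)
        prev = torch.cat([bos, gold_tags[:, :-1]], dim=1)   # teacher forcing
        feats = torch.cat([hidden, self.tag_embed(prev)], dim=-1)
        return self.out(feats)                              # (B, T, num_tags)

# Toy usage: one cross-entropy training step on random data.
model = LMTagDecoder(vocab_size=1000, num_tags=9)
tokens = torch.randint(0, 1000, (2, 12))
tags = torch.randint(0, 9, (2, 12))
logits = model(tokens, tags)
loss = nn.functional.cross_entropy(logits.reshape(-1, 9), tags.reshape(-1))
loss.backward()
```

Because each tag distribution is normalized locally, training requires no forward-backward pass over tag sequences, which is one way such a model can train faster than a CRF layer; whether this matches the paper's exact setup is not stated on this page.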
Anthology ID: 2020.coling-main.612
Volume: Proceedings of the 28th International Conference on Computational Linguistics
Month: December
Year: 2020
Address: Barcelona, Spain (Online)
Editors: Donia Scott, Nuria Bel, Chengqing Zong
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 6937–6941
URL: https://aclanthology.org/2020.coling-main.612
DOI: 10.18653/v1/2020.coling-main.612
Cite (ACL): Zhihong Lei, Weiyue Wang, Christian Dugast, and Hermann Ney. 2020. Neural Language Modeling for Named Entity Recognition. In Proceedings of the 28th International Conference on Computational Linguistics, pages 6937–6941, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal): Neural Language Modeling for Named Entity Recognition (Lei et al., COLING 2020)
PDF: https://aclanthology.org/2020.coling-main.612.pdf