MZET: Memory Augmented Zero-Shot Fine-grained Named Entity Typing

Tao Zhang, Congying Xia, Chun-Ta Lu, Philip Yu


Abstract
Named entity typing (NET) is the classification task of assigning an entity mention in context to given semantic types. However, as the size and granularity of the type inventory grow, little previous research has addressed newly emerged entity types. In this paper, we propose MZET, a novel memory-augmented FNET (Fine-grained NET) model, to tackle unseen types in a zero-shot manner. MZET incorporates character-level, word-level, and context-level information to learn the entity mention representation. In addition, MZET encodes both the semantic meaning and the hierarchical structure of entity types into the type representation. Finally, through a memory component that models the relationship between entity mentions and entity types, MZET transfers knowledge from seen entity types to zero-shot ones. Extensive experiments on three public datasets show the superior performance of MZET, which surpasses state-of-the-art FNET neural network models with up to an 8% gain in Micro-F1 and Macro-F1 score.
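The abstract describes the memory transfer only at a high level. Below is a minimal PyTorch sketch of one way such a seen-to-unseen transfer could work; it is not the authors' implementation, and all names, dimensions, and the similarity-based transfer step are illustrative assumptions:

```python
# Hypothetical sketch of memory-based zero-shot type scoring:
# a mention embedding attends over a "memory" of seen-type embeddings,
# and that attention mass is redistributed to unseen types via
# type-embedding similarity. Not the MZET reference code.
import torch
import torch.nn.functional as F

def zero_shot_type_scores(mention, seen_type_emb, unseen_type_emb):
    """mention: (d,) mention representation (e.g., fused char/word/context).
    seen_type_emb: (S, d) embeddings of seen types (the memory).
    unseen_type_emb: (U, d) embeddings of unseen (zero-shot) types.
    Returns: (U,) scores over the unseen types."""
    # Attention of the mention over the seen-type memory.
    attn = F.softmax(seen_type_emb @ mention, dim=0)            # (S,)
    # Relate each unseen type to the seen types by embedding similarity.
    sim = F.softmax(unseen_type_emb @ seen_type_emb.T, dim=1)   # (U, S)
    # Transfer attention mass from seen types to unseen types.
    return sim @ attn                                           # (U,)

# Toy usage with random embeddings of dimension d=8.
d, S, U = 8, 5, 3
scores = zero_shot_type_scores(torch.randn(d), torch.randn(S, d),
                               torch.randn(U, d))
print(scores)  # one score per unseen type
```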
Anthology ID:
2020.coling-main.7
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
77–87
URL:
https://aclanthology.org/2020.coling-main.7
DOI:
10.18653/v1/2020.coling-main.7
Cite (ACL):
Tao Zhang, Congying Xia, Chun-Ta Lu, and Philip Yu. 2020. MZET: Memory Augmented Zero-Shot Fine-grained Named Entity Typing. In Proceedings of the 28th International Conference on Computational Linguistics, pages 77–87, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
MZET: Memory Augmented Zero-Shot Fine-grained Named Entity Typing (Zhang et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.7.pdf