Less than One-shot: Named Entity Recognition via Extremely Weak Supervision

Letian Peng, Zihan Wang, Jingbo Shang


Abstract
We study the named entity recognition (NER) problem under the extremely weak supervision (XWS) setting, where only one example entity per type is given in a context-free way. Although XWS is even lighter than one-shot in terms of the amount of supervision, our proposed method, X-NER, outperforms state-of-the-art one-shot NER methods. We first mine entity spans that are similar to the example entities from an unlabelled training corpus. Instead of utilizing entity span representations from language models, we find it more effective to compare the context distributions before and after the span is replaced by the entity example. We then leverage the top-ranked spans as pseudo-labels to train an NER tagger. Extensive experiments and analyses on 4 NER datasets show the superior end-to-end NER performance of X-NER, which significantly outperforms state-of-the-art few-shot methods given 1-shot supervision as well as ChatGPT annotations. Finally, X-NER possesses several notable properties, such as inheriting the cross-lingual abilities of the underlying language models.
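The core ranking step described above can be illustrated with a minimal sketch: substitute the example entity for a candidate span and compare the resulting context distributions, scoring candidates by divergence (lower divergence suggests the span plays the same role as the example entity). The distributions and function names below are illustrative assumptions, not the paper's implementation; in X-NER the distributions would come from a language model's predictions over context tokens.

```python
import math

def kl_divergence(p, q):
    """KL(p || q) for discrete distributions given as aligned lists."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def span_divergence(dist_with_span, dist_with_example):
    """Symmetric KL between the context distribution with the candidate
    span in place and with the example entity substituted. Lower means
    the replacement perturbs the context less, i.e. the candidate is
    more likely to be of the same entity type."""
    return (kl_divergence(dist_with_span, dist_with_example)
            + kl_divergence(dist_with_example, dist_with_span))

# Hypothetical context distributions over three context tokens
# (illustrative numbers only).
ctx_original = [0.60, 0.30, 0.10]
ctx_same_type = [0.55, 0.35, 0.10]   # candidate of the same entity type
ctx_unrelated = [0.10, 0.20, 0.70]   # unrelated candidate span

close = span_divergence(ctx_original, ctx_same_type)
far = span_divergence(ctx_original, ctx_unrelated)

# Candidates are ranked by ascending divergence; the top-ranked spans
# become pseudo-labels for training the NER tagger.
assert close < far
```

Ranking by divergence of context distributions, rather than comparing span embeddings directly, is what the abstract identifies as the more effective signal.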
Anthology ID:
2023.findings-emnlp.908
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13603–13616
URL:
https://aclanthology.org/2023.findings-emnlp.908
DOI:
10.18653/v1/2023.findings-emnlp.908
Cite (ACL):
Letian Peng, Zihan Wang, and Jingbo Shang. 2023. Less than One-shot: Named Entity Recognition via Extremely Weak Supervision. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 13603–13616, Singapore. Association for Computational Linguistics.
Cite (Informal):
Less than One-shot: Named Entity Recognition via Extremely Weak Supervision (Peng et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.908.pdf