Jiasheng Zhang
2023
2INER: Instructive and In-Context Learning on Few-Shot Named Entity Recognition
Jiasheng Zhang | Xikai Liu | Xinyi Lai | Yan Gao | Shusen Wang | Yao Hu | Yiqing Lin
Findings of the Association for Computational Linguistics: EMNLP 2023
Prompt-based learning has emerged as a powerful technique in natural language processing (NLP) due to its ability to leverage pre-training knowledge for downstream few-shot tasks. In this paper, we propose 2INER, a novel text-to-text framework for Few-Shot Named Entity Recognition (NER) tasks. Our approach employs instruction finetuning based on InstructionNER to enable the model to effectively comprehend and process task-specific instructions, including both main and auxiliary tasks. We also introduce a new auxiliary task, called Type Extracting, to enhance the model’s understanding of entity types in the overall semantic context of a sentence. To facilitate in-context learning, we concatenate examples to the input, enabling the model to learn from additional contextual information. Experimental results on four datasets demonstrate that our approach outperforms existing Few-Shot NER methods and remains competitive with state-of-the-art standard NER algorithms.
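The in-context learning step described here can be sketched as straightforward prompt construction: an instruction is followed by labeled examples concatenated before the query sentence. The prompt template and label phrasing below are hypothetical illustrations, not the paper's exact format.

```python
# Sketch of few-shot prompt construction for text-to-text NER.
# The instruction wording and "span is type" output format are assumptions
# for illustration, not the 2INER paper's actual template.

def build_prompt(instruction, examples, sentence):
    """Concatenate an instruction, labeled in-context examples, and the query."""
    parts = [instruction]
    for text, entities in examples:
        labeled = "; ".join(f"{span} is {etype}" for span, etype in entities)
        parts.append(f"Sentence: {text}\nEntities: {labeled}")
    parts.append(f"Sentence: {sentence}\nEntities:")
    return "\n\n".join(parts)

prompt = build_prompt(
    "Extract all named entities and their types from the sentence.",
    [("Barack Obama visited Paris.",
      [("Barack Obama", "person"), ("Paris", "location")])],
    "Apple opened a store in Tokyo.",
)
print(prompt)
```

The model then learns the output format from the concatenated examples rather than from parameter updates alone.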
2019
Treat the Word As a Whole or Look Inside? Subword Embeddings Model Language Change and Typology
Yang Xu | Jiasheng Zhang | David Reitter
Proceedings of the 1st International Workshop on Computational Approaches to Historical Language Change
We use a variant of a word embedding model that incorporates subword information to characterize the degree of compositionality in lexical semantics. Our models reveal interesting yet contrasting patterns of long-term change across multiple languages: Indo-European languages put more weight on subword units in newer words, while Chinese, conversely, puts less weight on subwords and more weight on the word as a whole. Our method provides novel evidence and methodology that enrich existing theories in evolutionary linguistics. The resulting word vectors also perform decently on NLP-related tasks.
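The subword weighting described above can be sketched as a FastText-style composition, where a word's vector mixes a whole-word vector with the mean of its character n-gram vectors. The mixing weight `alpha` and the toy deterministic embeddings below are assumptions for illustration, not the paper's learned parameters.

```python
# Sketch of subword-aware word vectors (FastText-style composition).
# alpha -> 1 treats the word as a whole; alpha -> 0 relies on subword units.
# The deterministic toy embeddings here are illustrative assumptions only.
import numpy as np

def char_ngrams(word, n=3):
    padded = f"<{word}>"  # boundary markers, as in subword embedding models
    return [padded[i:i + n] for i in range(max(len(padded) - n + 1, 1))]

def toy_vec(key, dim=8):
    # deterministic pseudo-random embedding, seeded by the string's characters
    rng = np.random.default_rng(sum(ord(c) for c in key))
    return rng.standard_normal(dim)

def word_vector(word, alpha=0.5, dim=8):
    """Mix the whole-word vector with the mean of its subword n-gram vectors."""
    whole = toy_vec(word, dim)
    subs = np.mean([toy_vec(g, dim) for g in char_ngrams(word)], axis=0)
    return alpha * whole + (1 - alpha) * subs

v = word_vector("typology", alpha=0.7)
print(v.shape)  # (8,)
```

Comparing how well fitted values of `alpha` track word age across languages is the kind of analysis the abstract's contrast between Indo-European languages and Chinese rests on.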