An Exploration of Prompt-Based Zero-Shot Relation Extraction Method
Zhao Jun | Hu Yuan | Xu Nuo | Gui Tao | Zhang Qi | Chen Yunwen | Gao Xiang
Proceedings of the 21st Chinese National Conference on Computational Linguistics, 2022
Zero-shot relation extraction is an important method for dealing with newly emerging relations in the real world, which lack labeled data. However, mainstream two-tower zero-shot methods usually rely on large-scale, in-domain labeled data for predefined relations. In this work, we view zero-shot relation extraction as a semantic matching task optimized by prompt-tuning, which maintains superior generalization performance even when the labeled data of predefined relations are extremely scarce. To maximize the efficiency of data exploitation, instead of directly fine-tuning, we introduce a prompt-tuning technique to elicit the existing relational knowledge in pre-trained language models (PLMs). In addition, very few relation descriptions are exposed to the model during training, which we argue is the performance bottleneck of two-tower methods. To break through this bottleneck, we model the semantic interaction between relational instances and their descriptions directly during encoding. Experimental results on two academic datasets show that (1) our method outperforms the previous state-of-the-art method by a large margin across different samples of predefined relations; and (2) this advantage is further amplified in the low-resource scenario.
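The core idea described in the abstract, jointly encoding a relation instance with each candidate relation description and scoring the match through a prompt, can be sketched with off-the-shelf tooling. The following is a minimal illustration, not the authors' implementation: the masked-LM checkpoint, the prompt template, and the "yes"/"no" verbalizer are all assumptions chosen for the sketch; in the paper the prompt would additionally be tuned on the scarce labeled data of predefined relations.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

MODEL = "bert-base-uncased"  # assumption: any masked LM could stand in here
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForMaskedLM.from_pretrained(MODEL).eval()

def match_score(sentence, head, tail, description):
    """Jointly encode one instance and one relation description (single-tower)
    and score the match via a [MASK] verbalizer. Template is hypothetical."""
    prompt = (f"{sentence} The relation between {head} and {tail} is: "
              f"{description}? {tokenizer.mask_token}.")
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    # Position of the [MASK] token in the encoded prompt.
    mask_pos = (inputs.input_ids[0] == tokenizer.mask_token_id).nonzero()[0, 0]
    yes_id = tokenizer.convert_tokens_to_ids("yes")
    no_id = tokenizer.convert_tokens_to_ids("no")
    # Probability mass on "yes" vs. "no" at the masked position.
    probs = logits[0, mask_pos, [yes_id, no_id]].softmax(dim=-1)
    return probs[0].item()

sentence = "Steve Jobs founded Apple in 1976."
descriptions = {  # unseen relations, given only as natural-language descriptions
    "founded_by": "the tail entity was founded by the head entity",
    "born_in": "the head entity was born in the tail entity",
}
scores = {rel: match_score(sentence, "Steve Jobs", "Apple", desc)
          for rel, desc in descriptions.items()}
print(max(scores, key=scores.get))  # predicted (unseen) relation
```

The sketch only shows the cross-encoder structure that lets instance and description interact during encoding, which is what distinguishes this approach from two-tower methods that encode the two sides separately.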