Prompt Tuning for Few-shot Relation Extraction via Modeling Global and Local Graphs

Zirui Zhang, Yiyu Yang, Benhui Chen


Abstract
Recently, prompt-tuning has achieved remarkable results on few-shot tasks. The core idea of prompt-tuning is to insert prompt templates into the input, thereby converting a classification task into a masked language modeling problem. For few-shot relation extraction, however, mining more information from limited resources becomes particularly important. In this paper, we first construct a global relation graph based on label consistency to optimize the feature representations of samples across different relations. The global relation graph is then partitioned into a local relation subgraph for each relation type, optimizing the feature representations of samples within the same relation. This makes full use of the limited supervised information and improves tuning efficiency. In addition, relation labels themselves carry rich semantic knowledge that cannot be ignored. We therefore incorporate this knowledge into prompt-tuning: the latent knowledge implicit in relation labels is injected into the construction of learnable prompt templates. Extensive experiments on four datasets under low-resource settings show that the method achieves significant improvements.
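The abstract's core idea, converting relation classification into masked language modeling via a prompt template with a label-word verbalizer, can be illustrated with a minimal sketch. This is not the authors' code; the template wording, `LABEL_WORDS` mapping, and function names are illustrative assumptions.

```python
# Sketch of prompt-based relation extraction as masked language modeling.
# The template and label-word (verbalizer) mapping below are hypothetical
# examples, not the templates used in the paper.

LABEL_WORDS = {
    "founder_of": ["founder", "creator"],
    "born_in": ["native", "birthplace"],
}

def build_prompt(sentence, head, tail, mask_token="[MASK]"):
    """Wrap the input sentence in a prompt template containing a mask.
    A masked language model would then predict a word at the mask
    position instead of emitting a class label directly."""
    return f"{sentence} {head} is the {mask_token} of {tail}."

def map_prediction(predicted_word):
    """Verbalizer: map the word predicted at the mask position back
    to a relation type; return None if no label word matches."""
    for relation, words in LABEL_WORDS.items():
        if predicted_word in words:
            return relation
    return None
```

For example, `build_prompt("Steve Jobs started Apple in 1976.", "Steve Jobs", "Apple")` yields a masked sentence, and if the language model predicts "founder" at the mask, `map_prediction` resolves it to `founder_of`. The paper's contribution sits on top of such a pipeline: the prompt templates are made learnable and initialized from label semantics, while global and local relation graphs regularize the sample representations.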
Anthology ID:
2024.lrec-main.1158
Volume:
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month:
May
Year:
2024
Address:
Torino, Italia
Editors:
Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues:
LREC | COLING
Publisher:
ELRA and ICCL
Pages:
13233–13242
URL:
https://aclanthology.org/2024.lrec-main.1158
Cite (ACL):
Zirui Zhang, Yiyu Yang, and Benhui Chen. 2024. Prompt Tuning for Few-shot Relation Extraction via Modeling Global and Local Graphs. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 13233–13242, Torino, Italia. ELRA and ICCL.
Cite (Informal):
Prompt Tuning for Few-shot Relation Extraction via Modeling Global and Local Graphs (Zhang et al., LREC-COLING 2024)
PDF:
https://aclanthology.org/2024.lrec-main.1158.pdf