RGL: A Simple yet Effective Relation Graph Augmented Prompt-based Tuning Approach for Few-Shot Learning
Yaqing Wang | Xin Tian | Haoyi Xiong | Yueyang Li | Zeyu Chen | Sheng Guo | Dejing Dou
Findings of the Association for Computational Linguistics: NAACL 2022
Pre-trained language models (PLMs) can provide a good starting point for downstream applications. However, it is difficult to generalize PLMs to new tasks given a few labeled samples. In this work, we show that Relation Graph augmented Learning (RGL) can improve the performance of few-shot natural language understanding tasks. During learning, RGL constructs a relation graph based on the label consistency between samples in the same batch, and learns to solve the resultant node classification and link prediction problems on the relation graph. In this way, RGL fully exploits the limited supervised information, which can boost the tuning effectiveness. Extensive experimental results show that RGL consistently improves the performance of prompt-based tuning strategies.
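The abstract describes building a relation graph from label consistency within a batch and adding a link-prediction objective over it. A minimal sketch of that idea, assuming a simple setup where batch embeddings are scored by dot-product similarity and edges are targets whenever two samples share a label (the function name and loss form are illustrative, not the paper's exact implementation):

```python
import numpy as np

def relation_graph_link_loss(embeddings, labels):
    """Hypothetical sketch of an RGL-style auxiliary objective.

    Builds a relation graph over a batch: edge (i, j) exists iff
    labels[i] == labels[j] (label consistency). Edge probabilities
    are predicted from pairwise embedding similarity, and a binary
    cross-entropy link-prediction loss compares them to the graph.
    """
    labels = np.asarray(labels)
    embeddings = np.asarray(embeddings, dtype=float)
    # Target adjacency matrix from label consistency.
    adj = (labels[:, None] == labels[None, :]).astype(float)
    # Predicted edge probabilities: sigmoid of dot-product similarity.
    scores = embeddings @ embeddings.T
    probs = 1.0 / (1.0 + np.exp(-scores))
    # Binary cross-entropy over all sample pairs.
    eps = 1e-9
    link_loss = -np.mean(
        adj * np.log(probs + eps) + (1.0 - adj) * np.log(1.0 - probs + eps)
    )
    return adj, link_loss
```

Because the graph is rebuilt per batch from the labels already in hand, this extracts extra supervision from the same few-shot examples, which is the mechanism the abstract credits for the improvement.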