%0 Conference Proceedings
%T FPC: Fine-tuning with Prompt Curriculum for Relation Extraction
%A Yang, Sicheng
%A Song, Dandan
%Y He, Yulan
%Y Ji, Heng
%Y Li, Sujian
%Y Liu, Yang
%Y Chang, Chia-Hui
%S Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
%D 2022
%8 November
%I Association for Computational Linguistics
%C Online only
%F yang-song-2022-fpc
%X Current classification methods for relation extraction (RE) generally utilize pre-trained language models (PLMs) and have achieved strong results. However, such methods treat relation labels directly as class numbers and therefore ignore the semantics of the labels. Recently, prompt-based fine-tuning has been proposed and has attracted much attention. Methods of this kind insert templates into the input and convert the classification task into a (masked) language modeling problem. Inspired by this, we propose a novel method, Fine-tuning with Prompt Curriculum (FPC), for RE, with two distinctive characteristics: relation prompt learning, which introduces an auxiliary prompt-based fine-tuning task to make the model capture the semantics of relation labels; and the prompt learning curriculum, a fine-tuning procedure that includes an increasingly difficult task to adapt the model to the difficult multi-task setting. We have conducted extensive experiments on four widely used RE benchmarks under fully supervised and low-resource settings. The experimental results show that FPC significantly outperforms existing methods and obtains new state-of-the-art results.
%U https://aclanthology.org/2022.aacl-main.78
%P 1065-1077