RFBFN: A Relation-First Blank Filling Network for Joint Relational Triple Extraction

Zhe Li, Luoyi Fu, Xinbing Wang, Haisong Zhang, Chenghu Zhou


Abstract
Joint relational triple extraction from unstructured text is an important task in information extraction. However, most existing works either ignore the semantic information of relations or predict subjects and objects sequentially. To address these issues, we introduce a new blank-filling paradigm for the task and propose a relation-first blank filling network (RFBFN). Specifically, we first detect potential relations expressed in the text to aid the subsequent entity-pair extraction. Then, we transform relations into relation templates with blanks, which contain the fine-grained semantic representation of the relations. Finally, the corresponding subjects and objects are extracted simultaneously by filling the blanks. We evaluate the proposed model on public benchmark datasets. Experimental results show that our model outperforms current state-of-the-art methods. The source code of our work is available at: https://github.com/lizhe2016/RFBFN.
Anthology ID:
2022.acl-srw.2
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Samuel Louvan, Andrea Madotto, Brielen Madureira
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
10–20
URL:
https://aclanthology.org/2022.acl-srw.2
DOI:
10.18653/v1/2022.acl-srw.2
Cite (ACL):
Zhe Li, Luoyi Fu, Xinbing Wang, Haisong Zhang, and Chenghu Zhou. 2022. RFBFN: A Relation-First Blank Filling Network for Joint Relational Triple Extraction. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop, pages 10–20, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
RFBFN: A Relation-First Blank Filling Network for Joint Relational Triple Extraction (Li et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-srw.2.pdf
Code:
lizhe2016/rfbfn