Explore Unsupervised Structures in Pretrained Models for Relation Extraction

Xi Yang, Tao Ji, Yuanbin Wu


Abstract
Syntactic trees have been widely applied in relation extraction (RE). However, since parsing quality is not stable across text domains and a pre-defined grammar may not fit the target relation schema well, introducing syntactic structures sometimes fails to improve RE performance consistently. In this work, we study RE models with various unsupervised structures mined from pre-trained language models (e.g., BERT). We show that, similar to syntactic trees, unsupervised structures are quite informative for the RE task: they achieve competitive (even the best) performance on benchmark RE datasets (ACE05, WebNLG, SciERC). We also conduct detailed analyses of their ability to adapt to new RE domains and of the influence of noisy links in those structures. The results suggest that unsupervised structures are reasonable alternatives to the commonly used syntactic structures in relation extraction models.
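The abstract does not spell out how unsupervised structures are mined from a pretrained model, and the sketch below is not necessarily the authors' procedure. It illustrates one common way such a structure can be obtained: averaging BERT attention heads and extracting a spanning tree over tokens. The model name, layer choice, and head-averaging are assumptions made for illustration, using the HuggingFace transformers and SciPy libraries.

```python
# Illustrative sketch only (not the paper's method): build an unsupervised
# tree over tokens from BERT attention weights.
import numpy as np
import torch
from scipy.sparse.csgraph import minimum_spanning_tree
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

sentence = "The company acquired the startup in 2020."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions: one (batch, heads, seq, seq) tensor per layer.
# Assumption: use the last layer and average over heads.
attn = outputs.attentions[-1].mean(dim=1)[0].numpy()  # (seq, seq)

# Symmetrize, then convert affinities to positive costs so that a minimum
# spanning tree keeps the highest-attention links.
affinity = (attn + attn.T) / 2
costs = affinity.max() - affinity + 1e-6
tree = minimum_spanning_tree(costs).toarray()

# Print the induced token-to-token links.
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for i, j in zip(*np.nonzero(tree)):
    print(f"{tokens[i]:>12} -- {tokens[j]}")
```

Such a tree can then play the same role as a syntactic parse in a structure-aware RE model, e.g., as the graph fed to a graph neural network encoder.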
Anthology ID:
2022.findings-emnlp.453
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6103–6117
URL:
https://aclanthology.org/2022.findings-emnlp.453
DOI:
10.18653/v1/2022.findings-emnlp.453
Cite (ACL):
Xi Yang, Tao Ji, and Yuanbin Wu. 2022. Explore Unsupervised Structures in Pretrained Models for Relation Extraction. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 6103–6117, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Explore Unsupervised Structures in Pretrained Models for Relation Extraction (Yang et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.453.pdf