Topic-Oriented Open Relation Extraction with A Priori Seed Generation

Linyi Ding, Jinfeng Xiao, Sizhe Zhou, Chaoqi Yang, Jiawei Han


Abstract
The field of open relation extraction (ORE) has recently seen significant advances thanks to the growing capability of large language models (LLMs). Nevertheless, challenges persist when ORE is performed on specific topics. Existing methods give sub-optimal results along five dimensions: factualness, topic relevance, informativeness, coverage, and uniformity. To improve topic-oriented ORE, we propose a zero-shot approach called PriORE: Open Relation Extraction with a Priori seed generation. PriORE leverages the built-in knowledge of LLMs to maintain a dynamic seed relation dictionary for the topic. The dictionary is initialized by seed relations generated from topic-relevant entity types and expanded during contextualized ORE. PriORE then reduces the randomness in generative ORE by converting it to a more robust relation classification task. Experiments show the approach enables better topic-oriented control over the generated relations and thus improves ORE performance along the five dimensions, especially on specialized and narrow topics.
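The paper itself is not reproduced on this page, but the abstract outlines the core loop clearly enough for an illustrative sketch. The snippet below is a minimal, hypothetical rendering of that idea, not the authors' implementation: an LLM call (stubbed here as `llm`) first proposes seed relations for pairs of topic-relevant entity types to initialize a relation dictionary, and each sentence is then handled as classification over that dictionary, with an escape option that lets newly proposed relations expand it. All function names, prompts, and the `llm` interface are assumptions.

```python
# Illustrative sketch of a PriORE-style pipeline (hypothetical; not the authors' code).
# `llm` stands in for any chat-completion call that returns plain text.

def llm(prompt: str) -> str:
    """Stub for an LLM call; replace with a real API client."""
    raise NotImplementedError("plug in an LLM backend here")


def init_seed_relations(topic: str, entity_types: list[str]) -> set[str]:
    """A priori seed generation: ask the LLM for plausible relations
    between topic-relevant entity types, before seeing any document."""
    seeds: set[str] = set()
    for head in entity_types:
        for tail in entity_types:
            reply = llm(
                f"Topic: {topic}\n"
                f"List short relation phrases that could hold between "
                f"a {head} and a {tail}, one per line."
            )
            seeds.update(
                line.strip().lower() for line in reply.splitlines() if line.strip()
            )
    return seeds


def extract_relation(sentence: str, head: str, tail: str,
                     relation_dict: set[str]) -> str:
    """Contextualized ORE framed as classification: pick a relation from the
    current dictionary, or propose a new one, which then expands it."""
    options = sorted(relation_dict)
    reply = llm(
        f"Sentence: {sentence}\n"
        f"Entities: {head} / {tail}\n"
        f"Choose the best relation from: {', '.join(options)}.\n"
        f"If none fits, answer NEW: <relation phrase>."
    ).strip()
    if reply.upper().startswith("NEW:"):
        new_rel = reply[4:].strip().lower()
        relation_dict.add(new_rel)  # dynamic expansion of the seed dictionary
        return new_rel
    return reply.lower()
```

The classification framing is what reduces the randomness of free-form generation: the model mostly selects from a topic-constrained label set rather than inventing an unconstrained relation phrase for every sentence.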
Anthology ID: 2024.emnlp-main.766
Volume: Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month: November
Year: 2024
Address: Miami, Florida, USA
Editors: Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 13834–13845
URL: https://aclanthology.org/2024.emnlp-main.766
PDF: https://aclanthology.org/2024.emnlp-main.766.pdf

Cite (ACL): Linyi Ding, Jinfeng Xiao, Sizhe Zhou, Chaoqi Yang, and Jiawei Han. 2024. Topic-Oriented Open Relation Extraction with A Priori Seed Generation. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 13834–13845, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal): Topic-Oriented Open Relation Extraction with A Priori Seed Generation (Ding et al., EMNLP 2024)