A Unified Representation Learning Strategy for Open Relation Extraction with Ranked List Loss

Lou Renze, Zhang Fan, Zhou Xiaowei, Wang Yutong, Wu Minghui, Sun Lin


Abstract
Open Relation Extraction (OpenRE), which aims to extract relational facts from open-domain corpora, is a sub-task of Relation Extraction and a crucial upstream process for many other NLP tasks. However, various previous clustering-based OpenRE strategies either confine themselves to unsupervised paradigms or cannot directly build a unified relational semantic space, hence impacting downstream clustering. In this paper, we propose a novel supervised learning framework named MORE-RLL (Metric learning-based Open Relation Extraction with Ranked List Loss) to construct a semantic metric space by utilizing Ranked List Loss to discover new relational facts. Experiments on real-world datasets show that MORE-RLL can achieve excellent performance compared with previous state-of-the-art methods, demonstrating the capability of MORE-RLL in unified semantic representation learning and novel relational fact detection.
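For context, the sketch below illustrates the generic Ranked List Loss (Wang et al., CVPR 2019) that the framework builds on, applied to a batch of relation embeddings: positives are pulled inside an inner boundary, negatives are pushed beyond an outer boundary, and harder negatives receive larger weights. This is a minimal PyTorch illustration under assumed hyperparameters (the values of alpha, margin, and temperature here are placeholders), not the authors' released MORE-RLL implementation.

```python
import torch
import torch.nn.functional as F

def ranked_list_loss(embeddings, labels, alpha=1.2, margin=0.4, temperature=10.0):
    """Minimal Ranked List Loss over a batch of relation embeddings.

    For each anchor, positives farther than (alpha - margin) are pulled
    inside the boundary, and negatives closer than alpha are pushed out,
    each negative weighted by how strongly it violates the boundary.
    """
    # Pairwise Euclidean distances between all embeddings in the batch.
    dist = torch.cdist(embeddings, embeddings, p=2)

    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    eye = torch.eye(len(labels), dtype=torch.bool, device=labels.device)
    pos_mask = same & ~eye   # same relation, excluding the anchor itself
    neg_mask = ~same         # different relation

    # Positives: hinge on distances beyond the inner boundary (alpha - margin).
    pos_loss = (F.relu(dist - (alpha - margin)) * pos_mask).sum(dim=1)
    pos_loss = pos_loss / pos_mask.sum(dim=1).clamp(min=1)

    # Negatives: hinge on distances inside the outer boundary alpha,
    # softmax-weighted so harder negatives contribute more.
    neg_violation = F.relu(alpha - dist) * neg_mask
    weights = torch.exp(temperature * neg_violation) * neg_mask
    weights = weights / weights.sum(dim=1, keepdim=True).clamp(min=1e-12)
    neg_loss = (weights * neg_violation).sum(dim=1)

    return (pos_loss + neg_loss).mean()

# Toy usage with random stand-ins for encoder outputs and relation labels.
emb = F.normalize(torch.randn(16, 128, requires_grad=True), dim=1)
lbl = torch.randint(0, 4, (16,))
loss = ranked_list_loss(emb, lbl)
loss.backward()
```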
Anthology ID:
2021.ccl-1.98
Volume:
Proceedings of the 20th Chinese National Conference on Computational Linguistics
Month:
August
Year:
2021
Address:
Huhhot, China
Editors:
Sheng Li (李生), Maosong Sun (孙茂松), Yang Liu (刘洋), Hua Wu (吴华), Kang Liu (刘康), Wanxiang Che (车万翔), Shizhu He (何世柱), Gaoqi Rao (饶高琦)
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
1096–1108
Language:
English
URL:
https://aclanthology.org/2021.ccl-1.98
Cite (ACL):
Lou Renze, Zhang Fan, Zhou Xiaowei, Wang Yutong, Wu Minghui, and Sun Lin. 2021. A Unified Representation Learning Strategy for Open Relation Extraction with Ranked List Loss. In Proceedings of the 20th Chinese National Conference on Computational Linguistics, pages 1096–1108, Huhhot, China. Chinese Information Processing Society of China.
Cite (Informal):
A Unified Representation Learning Strategy for Open Relation Extraction with Ranked List Loss (Lou et al., CCL 2021)
PDF:
https://aclanthology.org/2021.ccl-1.98.pdf
Data
FewRel, New York Times Annotated Corpus