HiURE: Hierarchical Exemplar Contrastive Learning for Unsupervised Relation Extraction

Shuliang Liu, Xuming Hu, Chenwei Zhang, Shu’ang Li, Lijie Wen, Philip Yu


Abstract
Unsupervised relation extraction aims to extract the relationship between entities from natural language sentences without prior information on relational scope or distribution. Existing works either use self-supervised schemes that refine relational feature signals by iteratively alternating adaptive clustering and classification, which provokes a gradual drift problem, or adopt instance-wise contrastive learning, which unreasonably pushes apart sentence pairs that are semantically similar. To overcome these defects, we propose a novel contrastive learning framework named HiURE, which derives hierarchical signals from the relational feature space using cross-hierarchy attention and effectively optimizes the relation representations of sentences under exemplar-wise contrastive learning. Experimental results on two public datasets demonstrate the effectiveness and robustness of HiURE on unsupervised relation extraction compared with state-of-the-art models.
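
To make the contrast with instance-wise objectives concrete, below is a minimal PyTorch sketch of an exemplar-wise (prototype-style) contrastive loss, in which each sentence representation is attracted to its assigned exemplar and repelled from the others. This is an illustrative assumption, not the authors' implementation (their code is linked under Software/Code below): the function name, the temperature value, and the use of a single flat set of exemplars are hypothetical, and the sketch omits HiURE's hierarchical exemplar derivation via cross-hierarchy attention.

import torch
import torch.nn.functional as F

def exemplar_contrastive_loss(features, exemplars, assignments, temperature=0.5):
    # features:    (N, D) sentence representations
    # exemplars:   (K, D) exemplar vectors, e.g. cluster centroids
    # assignments: (N,)   index of the exemplar each sentence is assigned to
    z = F.normalize(features, dim=1)      # unit-normalize sentence vectors
    c = F.normalize(exemplars, dim=1)     # unit-normalize exemplar vectors
    logits = z @ c.t() / temperature      # (N, K) scaled cosine similarities
    # Cross-entropy pulls each sentence toward its own exemplar and
    # pushes it away from all other exemplars.
    return F.cross_entropy(logits, assignments)

# Toy usage with hypothetical sizes: 8 sentences, 3 exemplars, 128-dim encodings.
feats = torch.randn(8, 128, requires_grad=True)
exemplars = torch.randn(3, 128)
assign = torch.randint(0, 3, (8,))
loss = exemplar_contrastive_loss(feats, exemplars, assign)
loss.backward()

Unlike an instance-wise InfoNCE loss, two semantically similar sentences assigned to the same exemplar are never treated as negatives of each other here, which is the failure mode of instance-wise contrastive learning that the abstract points out.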
Anthology ID:
2022.naacl-main.437
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
5970–5980
URL:
https://aclanthology.org/2022.naacl-main.437
DOI:
10.18653/v1/2022.naacl-main.437
Cite (ACL):
Shuliang Liu, Xuming Hu, Chenwei Zhang, Shu’ang Li, Lijie Wen, and Philip Yu. 2022. HiURE: Hierarchical Exemplar Contrastive Learning for Unsupervised Relation Extraction. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 5970–5980, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
HiURE: Hierarchical Exemplar Contrastive Learning for Unsupervised Relation Extraction (Liu et al., NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-main.437.pdf
Software:
 2022.naacl-main.437.software.zip
Video:
 https://aclanthology.org/2022.naacl-main.437.mp4
Code:
 thu-bpm/hiure
Data:
New York Times Annotated Corpus