Improving Unsupervised Relation Extraction by Augmenting Diverse Sentence Pairs

Qing Wang, Kang Zhou, Qiao Qiao, Yuepei Li, Qi Li


Abstract
Unsupervised relation extraction (URE) aims to extract relations between named entities from raw text without requiring manual annotations or pre-existing knowledge bases. Recent URE studies place notable emphasis on contrastive learning strategies for acquiring relation representations, but they often overlook two important aspects: the inclusion of diverse positive pairs for contrastive learning and the exploration of appropriate loss functions. In this paper, we propose AugURE, which augments positive pairs both within sentences and through cross-sentence pair extraction, increasing the diversity of positive pairs and strengthening the discriminative power of contrastive learning. We also identify a limitation of the noise-contrastive estimation (NCE) loss for relation representation learning and propose applying a margin loss to sentence pairs instead. Experiments on the NYT-FB and TACRED datasets demonstrate that the proposed relation representation learning, combined with a simple K-Means clustering, achieves state-of-the-art performance.
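The contrast the abstract draws between NCE and a margin loss over sentence pairs can be sketched in a few lines. The following is a generic illustration under common formulations, not the paper's actual code: the function names, the hinge formulation, and the hyperparameter values are assumptions.

```python
import torch
import torch.nn.functional as F


def info_nce_loss(anchor: torch.Tensor, positive: torch.Tensor,
                  temperature: float = 0.07) -> torch.Tensor:
    """Standard InfoNCE with in-batch negatives: row i of `positive`
    is the positive for row i of `anchor`; all other rows act as
    negatives. Both tensors have shape (batch, dim)."""
    a = F.normalize(anchor, dim=-1)
    p = F.normalize(positive, dim=-1)
    logits = a @ p.t() / temperature
    labels = torch.arange(a.size(0), device=a.device)
    return F.cross_entropy(logits, labels)


def margin_pair_loss(anchor: torch.Tensor, positive: torch.Tensor,
                     negative: torch.Tensor, margin: float = 0.4) -> torch.Tensor:
    """Hinge-style margin loss on cosine similarities: pushes each
    anchor at least `margin` closer to its positive than to a sampled
    negative, without forcing all non-positive pairs apart the way
    NCE's softmax over the whole batch does."""
    pos_sim = F.cosine_similarity(anchor, positive, dim=-1)
    neg_sim = F.cosine_similarity(anchor, negative, dim=-1)
    return F.relu(margin - pos_sim + neg_sim).mean()
```

Once representations are trained with such an objective, relation types can be induced by clustering the embeddings (the paper reports that a simple K-Means suffices for state-of-the-art results).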
Anthology ID:
2023.emnlp-main.745
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
12136–12147
URL:
https://aclanthology.org/2023.emnlp-main.745
DOI:
10.18653/v1/2023.emnlp-main.745
Cite (ACL):
Qing Wang, Kang Zhou, Qiao Qiao, Yuepei Li, and Qi Li. 2023. Improving Unsupervised Relation Extraction by Augmenting Diverse Sentence Pairs. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 12136–12147, Singapore. Association for Computational Linguistics.
Cite (Informal):
Improving Unsupervised Relation Extraction by Augmenting Diverse Sentence Pairs (Wang et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.745.pdf
Video:
https://aclanthology.org/2023.emnlp-main.745.mp4