Relation Extraction with Weighted Contrastive Pre-training on Distant Supervision

Zhen Wan, Fei Cheng, Qianying Liu, Zhuoyuan Mao, Haiyue Song, Sadao Kurohashi


Abstract
Contrastive pre-training on distant supervision has shown remarkable effectiveness in improving supervised relation extraction tasks. However, existing methods ignore the intrinsic noise of distant supervision during the pre-training stage. In this paper, we propose a weighted contrastive learning method that leverages supervised data to estimate the reliability of pre-training instances and explicitly reduces the effect of noise. Experimental results on three supervised datasets demonstrate the advantages of our proposed weighted contrastive learning approach over two state-of-the-art non-weighted baselines. Our code and models are available at: https://github.com/YukinoWan/WCL.
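The abstract only sketches the idea at a high level; the exact formulation is in the paper. As a hedged illustration of the general technique, a weighted contrastive objective can be realized by scaling each instance's InfoNCE term with a per-instance reliability weight, e.g. the probability that a classifier trained on the small supervised set assigns to the instance's distant-supervision label. The function name `weighted_info_nce` and the weighting scheme below are assumptions for illustration, not the authors' implementation:

```python
import torch
import torch.nn.functional as F

def weighted_info_nce(anchors, positives, weights, temperature=0.05):
    """Weighted InfoNCE sketch: each anchor-positive pair's loss term is
    scaled by a reliability weight in [0, 1].

    anchors, positives: (N, d) embeddings of distantly supervised pairs.
    weights: (N,) hypothetical reliability scores, e.g. probabilities from
             a relation classifier trained on the supervised data.
    """
    a = F.normalize(anchors, dim=-1)
    p = F.normalize(positives, dim=-1)
    logits = a @ p.t() / temperature          # (N, N) pairwise similarities
    labels = torch.arange(a.size(0), device=a.device)  # diagonal = positives
    per_pair = F.cross_entropy(logits, labels, reduction="none")  # (N,)
    # Down-weight likely-noisy distant-supervision pairs.
    return (weights * per_pair).sum() / weights.sum().clamp_min(1e-8)

# Usage sketch with random embeddings standing in for encoder outputs.
emb_a = torch.randn(32, 768)
emb_b = torch.randn(32, 768)
w = torch.rand(32)  # hypothetical reliability scores in [0, 1]
loss = weighted_info_nce(emb_a, emb_b, w)
```

Setting all weights to 1 recovers standard non-weighted contrastive pre-training, which is the kind of baseline the paper compares against.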
Anthology ID:
2023.findings-eacl.195
Volume:
Findings of the Association for Computational Linguistics: EACL 2023
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2580–2585
URL:
https://aclanthology.org/2023.findings-eacl.195
DOI:
10.18653/v1/2023.findings-eacl.195
Cite (ACL):
Zhen Wan, Fei Cheng, Qianying Liu, Zhuoyuan Mao, Haiyue Song, and Sadao Kurohashi. 2023. Relation Extraction with Weighted Contrastive Pre-training on Distant Supervision. In Findings of the Association for Computational Linguistics: EACL 2023, pages 2580–2585, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Relation Extraction with Weighted Contrastive Pre-training on Distant Supervision (Wan et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-eacl.195.pdf
Video:
https://aclanthology.org/2023.findings-eacl.195.mp4