Hierarchical Relation Extraction with Coarse-to-Fine Grained Attention

Xu Han, Pengfei Yu, Zhiyuan Liu, Maosong Sun, Peng Li


Abstract
Distantly supervised relation extraction employs existing knowledge graphs to automatically collect training data. While distant supervision is effective for scaling relation extraction to large corpora, it inevitably suffers from the wrong labeling problem. Many efforts have been devoted to identifying valid instances from noisy data. However, most existing methods handle each relation in isolation, ignoring the rich semantic correlations within relation hierarchies. In this paper, we aim to incorporate the hierarchical information of relations into distantly supervised relation extraction and propose a novel hierarchical attention scheme. The multiple layers of our hierarchical attention scheme provide coarse-to-fine granularity to better identify valid instances, which is especially effective for extracting long-tail relations. The experimental results on a large-scale benchmark dataset demonstrate that our models are capable of modeling the hierarchical information of relations and significantly outperform other baselines. The source code of this paper can be obtained from https://github.com/thunlp/HNRE.
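The coarse-to-fine idea in the abstract can be sketched in a few lines: at each layer of the relation hierarchy, a layer-specific relation query attends over the sentence representations in a bag, and the per-layer bag vectors are concatenated. This is a minimal illustrative sketch, not the authors' exact formulation; the shapes, query vectors, and function names below are hypothetical, and details such as the scoring function and training objective are omitted.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def hierarchical_attention(sent_reps, relation_queries):
    """Coarse-to-fine bag representation (illustrative sketch).

    sent_reps:        (n_sentences, d) encoded sentences in one bag
    relation_queries: (n_layers, d) one query per hierarchy layer,
                      ordered coarse to fine (hypothetical setup)
    Returns a concatenated (n_layers * d,) bag vector.
    """
    layer_reps = []
    for q in relation_queries:
        scores = sent_reps @ q        # (n_sentences,) attention scores
        alpha = softmax(scores)       # weights over noisy instances
        layer_reps.append(alpha @ sent_reps)  # weighted bag vector, (d,)
    return np.concatenate(layer_reps)

rng = np.random.default_rng(0)
bag = rng.normal(size=(5, 8))      # 5 sentences, dimension 8
queries = rng.normal(size=(3, 8))  # a 3-layer relation hierarchy
rep = hierarchical_attention(bag, queries)
print(rep.shape)  # (24,)
```

Attending separately at each hierarchy layer lets a rare (long-tail) relation borrow signal from its coarser ancestors, which share the same upper-layer queries with more frequent sibling relations.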
Anthology ID:
D18-1247
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2236–2245
URL:
https://aclanthology.org/D18-1247
DOI:
10.18653/v1/D18-1247
Cite (ACL):
Xu Han, Pengfei Yu, Zhiyuan Liu, Maosong Sun, and Peng Li. 2018. Hierarchical Relation Extraction with Coarse-to-Fine Grained Attention. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 2236–2245, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Hierarchical Relation Extraction with Coarse-to-Fine Grained Attention (Han et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1247.pdf
Code
 thunlp/HNRE