Investigating the Impact of Different Graph Representations for Relation Extraction with Graph Neural Networks

Moritz Blum, Gennaro Nolano, Basil Ell, Philipp Cimiano


Abstract
Graph Neural Networks (GNNs) have been applied successfully to various NLP tasks, particularly Relation Extraction (RE). Even though most of these approaches rely on the syntactic dependency tree of a sentence to derive a graph representation, the impact of this choice compared to other possible graph representations has not been evaluated. We examine the effect of different graph representations of text for GNNs applied to RE, considering, e.g., a fully connected graph of tokens, a graph of semantic role structures, and combinations thereof. We further examine the impact of injecting background knowledge from Knowledge Graphs (KGs) into the graph representation to obtain enhanced graph representations. Our results show that combining multiple graph representations can improve the model’s predictions. Moreover, the integration of background knowledge positively impacts scores, as enhancing the text graphs with Wikidata features or WordNet features can lead to an improvement of close to 0.1 points in F1.
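The page does not describe the authors' graph-construction pipeline; the following is a minimal, hypothetical Python sketch of how text graphs of the kinds named in the abstract (a fully connected token graph, a dependency graph, a semantic-role graph) might be built as adjacency matrices and combined by taking the union of their edge sets. The example sentence, edge lists, and function names are illustrative placeholders, not the paper's data format.

```python
import numpy as np

def edges_to_adjacency(num_tokens, edges):
    """Build a symmetric adjacency matrix from a list of (i, j) token-index pairs."""
    adj = np.zeros((num_tokens, num_tokens), dtype=np.float32)
    for i, j in edges:
        adj[i, j] = adj[j, i] = 1.0  # treat edges as undirected for the GNN
    return adj

def fully_connected_graph(num_tokens):
    """Adjacency matrix connecting every token to every other token (no self-loops)."""
    adj = np.ones((num_tokens, num_tokens), dtype=np.float32)
    np.fill_diagonal(adj, 0.0)
    return adj

def combine_graphs(*adjs):
    """Union of edge sets: an edge exists if it exists in any input representation."""
    combined = np.zeros_like(adjs[0])
    for adj in adjs:
        combined = np.maximum(combined, adj)
    return combined

# Hypothetical sentence and structures (placeholders for parser/SRL output).
tokens = ["Marie", "Curie", "discovered", "polonium"]
dep_edges = [(1, 0), (2, 1), (2, 3)]   # head-dependent pairs from a dependency parse
srl_edges = [(2, 1), (2, 3)]           # predicate-argument links from semantic roles

dep_adj = edges_to_adjacency(len(tokens), dep_edges)
srl_adj = edges_to_adjacency(len(tokens), srl_edges)
full_adj = fully_connected_graph(len(tokens))

# Combined sparse representation (dependency + semantic roles) vs. dense baseline.
combined_adj = combine_graphs(dep_adj, srl_adj)
print(combined_adj)
print(full_adj)
```

Any of the resulting adjacency matrices, together with token embeddings as node features, could then be passed to a standard GNN layer; the sketch only illustrates the graph-construction step compared in the paper.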
Anthology ID:
2024.dlnld-1.1
Volume:
Proceedings of the Workshop on Deep Learning and Linked Data (DLnLD) @ LREC-COLING 2024
Month:
May
Year:
2024
Address:
Torino, Italia
Editors:
Gilles Sérasset, Hugo Gonçalo Oliveira, Giedre Valunaite Oleskeviciene
Venues:
DLnLD | WS
Publisher:
ELRA and ICCL
Pages:
1–13
URL:
https://aclanthology.org/2024.dlnld-1.1
Cite (ACL):
Moritz Blum, Gennaro Nolano, Basil Ell, and Philipp Cimiano. 2024. Investigating the Impact of Different Graph Representations for Relation Extraction with Graph Neural Networks. In Proceedings of the Workshop on Deep Learning and Linked Data (DLnLD) @ LREC-COLING 2024, pages 1–13, Torino, Italia. ELRA and ICCL.
Cite (Informal):
Investigating the Impact of Different Graph Representations for Relation Extraction with Graph Neural Networks (Blum et al., DLnLD-WS 2024)
PDF:
https://aclanthology.org/2024.dlnld-1.1.pdf