Extracting Multi-valued Relations from Language Models

Sneha Singhania, Simon Razniewski, Gerhard Weikum


Abstract
The widespread use of latent language representations via pre-trained language models (LMs) suggests that they are a promising source of structured knowledge. However, existing methods focus only on a single object per subject-relation pair, even though multiple objects are often correct. To overcome this limitation, we analyze these representations for their potential to yield materialized multi-object relational knowledge. We formulate the problem as a rank-then-select task. For ranking candidate objects, we evaluate existing prompting techniques and propose new ones that incorporate domain knowledge. Among the selection methods, we find that choosing objects whose likelihood exceeds a learned relation-specific threshold gives a 49.5% F1 score. Our results highlight the difficulty of employing LMs for the multi-valued slot-filling task and pave the way for further research on extracting relational knowledge from latent language representations.
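To make the selection step concrete, the sketch below illustrates one plausible reading of "a learned relation-specific threshold": given LM likelihood scores for candidate objects (assumed to come from an earlier ranking step), it sweeps candidate thresholds on development data, keeps the one that maximizes mean F1, and applies it at prediction time. This is not the authors' code; the data, function names, and threshold-search procedure are illustrative assumptions.

# Hypothetical illustration of relation-specific threshold selection.
# Assumes candidate objects already carry LM likelihood scores; the
# example data is invented and is NOT taken from the paper.

from typing import Dict, List, Set, Tuple


def f1(predicted: Set[str], gold: Set[str]) -> float:
    """Set-level F1 between predicted and gold object sets."""
    if not predicted and not gold:
        return 1.0
    if not predicted or not gold:
        return 0.0
    tp = len(predicted & gold)
    precision = tp / len(predicted)
    recall = tp / len(gold)
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)


def learn_threshold(
    dev_examples: List[Tuple[Dict[str, float], Set[str]]]
) -> float:
    """Pick the score threshold that maximizes mean F1 on dev data.

    Each dev example pairs {candidate object: LM likelihood} with the
    gold set of correct objects for one subject of the relation.
    """
    candidate_thresholds = sorted(
        {score for scores, _ in dev_examples for score in scores.values()}
    )
    best_t, best_f1 = 0.0, -1.0
    for t in candidate_thresholds:
        mean_f1 = sum(
            f1({o for o, s in scores.items() if s >= t}, gold)
            for scores, gold in dev_examples
        ) / len(dev_examples)
        if mean_f1 > best_f1:
            best_t, best_f1 = t, mean_f1
    return best_t


def select(scores: Dict[str, float], threshold: float) -> Set[str]:
    """Keep every candidate whose likelihood clears the threshold."""
    return {obj for obj, score in scores.items() if score >= threshold}


if __name__ == "__main__":
    # Invented dev data for a multi-valued relation such as "shares-border-with".
    dev = [
        ({"France": 0.81, "Spain": 0.64, "Brazil": 0.12}, {"France", "Spain"}),
        ({"Austria": 0.72, "Italy": 0.55, "Japan": 0.05}, {"Austria", "Italy"}),
    ]
    t = learn_threshold(dev)
    print(t, select({"Portugal": 0.7, "Canada": 0.2}, t))

Because the threshold is fit per relation, relations with many correct objects can admit lower-scoring candidates than relations that typically have only one or two, which is the motivation for making the cutoff relation-specific rather than global.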
Anthology ID:
2023.repl4nlp-1.12
Volume:
Proceedings of the 8th Workshop on Representation Learning for NLP (RepL4NLP 2023)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Burcu Can, Maximilian Mozes, Samuel Cahyawijaya, Naomi Saphra, Nora Kassner, Shauli Ravfogel, Abhilasha Ravichander, Chen Zhao, Isabelle Augenstein, Anna Rogers, Kyunghyun Cho, Edward Grefenstette, Lena Voita
Venue:
RepL4NLP
Publisher:
Association for Computational Linguistics
Pages:
139–154
URL:
https://aclanthology.org/2023.repl4nlp-1.12
DOI:
10.18653/v1/2023.repl4nlp-1.12
Cite (ACL):
Sneha Singhania, Simon Razniewski, and Gerhard Weikum. 2023. Extracting Multi-valued Relations from Language Models. In Proceedings of the 8th Workshop on Representation Learning for NLP (RepL4NLP 2023), pages 139–154, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Extracting Multi-valued Relations from Language Models (Singhania et al., RepL4NLP 2023)
PDF:
https://aclanthology.org/2023.repl4nlp-1.12.pdf