Seeking Clozure: Robust Hypernym extraction from BERT with Anchored Prompts

Chunhua Liu, Trevor Cohn, Lea Frermann


Abstract
The automatic extraction of hypernym knowledge from large language models like BERT is an open problem, and it is unclear whether methods fail due to a lack of knowledge in the model or shortcomings of the extraction methods. In particular, methods fail on challenging cases which include rare or abstract concepts, and perform inconsistently under paraphrased prompts. In this study, we revisit the long line of work on pattern-based hypernym extraction, and use it as a diagnostic tool to thoroughly examine the hypernymy knowledge encoded in BERT and the limitations of hypernym extraction methods. We propose to construct prompts from established pattern structures: definitional (X is a Y); lexico-syntactic (Y such as X); and their anchored versions (Y such as X or Z). We devise an automatic method for anchor prediction, and compare different patterns in: (i) their effectiveness for hypernym retrieval from BERT across six English data sets; (ii) on challenge sets of rare and abstract concepts; and (iii) on consistency under paraphrasing. We show that anchoring is particularly useful for abstract concepts and in enhancing consistency across paraphrases, demonstrating how established methods in the field can inform prompt engineering.
Anthology ID:
2023.starsem-1.18
Volume:
Proceedings of the 12th Joint Conference on Lexical and Computational Semantics (*SEM 2023)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Alexis Palmer, Jose Camacho-Collados
Venue:
*SEM
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
193–206
URL:
https://aclanthology.org/2023.starsem-1.18
DOI:
10.18653/v1/2023.starsem-1.18
Cite (ACL):
Chunhua Liu, Trevor Cohn, and Lea Frermann. 2023. Seeking Clozure: Robust Hypernym extraction from BERT with Anchored Prompts. In Proceedings of the 12th Joint Conference on Lexical and Computational Semantics (*SEM 2023), pages 193–206, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Seeking Clozure: Robust Hypernym extraction from BERT with Anchored Prompts (Liu et al., *SEM 2023)
PDF:
https://aclanthology.org/2023.starsem-1.18.pdf