TAXI: Evaluating Categorical Knowledge Editing for Language Models

Derek Powell, Walter Gerych, Thomas Hartvigsen


Abstract
Humans rarely learn one fact in isolation. Instead, learning a new fact induces knowledge of other facts about the world. For example, in learning that a korat is a type of cat, you also infer it is a mammal and has claws, ensuring your model of the world is consistent. Knowledge editing aims to inject new facts into language models to improve their factuality, but current benchmarks fail to evaluate consistency, which is critical to ensure efficient, accurate, and generalizable edits. We manually create TAXI, a new benchmark dataset specifically designed to evaluate consistency in categorical knowledge edits. TAXI contains 11,120 multiple-choice queries for 976 edits spanning 41 categories (e.g., Dogs), 164 subjects (e.g., Labrador), and 183 properties (e.g., is a mammal). We then use TAXI to evaluate popular editors’ categorical consistency, measuring how often editing a subject’s category appropriately edits its properties. We find that 1) the editors achieve marginal, yet non-random consistency, 2) their consistency far underperforms human baselines, and 3) consistency is more achievable when editing atypical subjects.
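Concretely, each TAXI edit pairs a subject with a counterfactual category and a set of multiple-choice property queries; consistency is then the rate at which the edited model's property answers track the new category. The sketch below is purely illustrative and not taken from the paper: the record fields, the `query_model` callable, and the example queries are all hypothetical stand-ins for whatever format the released benchmark actually uses.

```python
# Illustrative sketch (not from the paper): what a TAXI-style
# categorical edit and its consistency check might look like.
from dataclasses import dataclass, field


@dataclass
class CategoricalEdit:
    subject: str       # e.g., "Labrador"
    new_category: str  # e.g., "Birds" (a counterfactual category edit)
    # Multiple-choice property queries whose answers should change with
    # the category, mapped to the answer implied by the *new* category.
    property_queries: dict[str, str] = field(default_factory=dict)


def consistency(edit: CategoricalEdit, query_model) -> float:
    """Fraction of property queries the edited model answers in line
    with the subject's new category (higher = more consistent)."""
    if not edit.property_queries:
        return 0.0
    hits = sum(
        query_model(question) == expected
        for question, expected in edit.property_queries.items()
    )
    return hits / len(edit.property_queries)


# Example: after editing "A Labrador is a kind of bird", a categorically
# consistent model should also update the subject's properties.
edit = CategoricalEdit(
    subject="Labrador",
    new_category="Birds",
    property_queries={
        "Is a Labrador a mammal? (yes/no)": "no",
        "Does a Labrador have feathers? (yes/no)": "yes",
    },
)
# A stub model that always answers "no" gets one of two queries right.
print(consistency(edit, query_model=lambda q: "no"))  # 0.5
```

Under this framing, an editor that only rewrites the category fact itself would score near chance on the property queries, which is the gap between editor and human consistency the paper reports.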
Anthology ID: 2024.findings-acl.906
Volume: Findings of the Association for Computational Linguistics ACL 2024
Month: August
Year: 2024
Address: Bangkok, Thailand and virtual meeting
Editors: Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 15343–15352
URL: https://aclanthology.org/2024.findings-acl.906
Cite (ACL):
Derek Powell, Walter Gerych, and Thomas Hartvigsen. 2024. TAXI: Evaluating Categorical Knowledge Editing for Language Models. In Findings of the Association for Computational Linguistics ACL 2024, pages 15343–15352, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
Cite (Informal):
TAXI: Evaluating Categorical Knowledge Editing for Language Models (Powell et al., Findings 2024)
PDF: https://aclanthology.org/2024.findings-acl.906.pdf