Better Call SAUL: Fluent and Consistent Language Model Editing with Generation Regularization

Mingyang Wang, Lukas Lange, Heike Adel, Jannik Strötgen, Hinrich Schuetze


Abstract
To ensure large language models contain up-to-date knowledge, they need to be updated regularly. However, model editing is challenging as it might also affect knowledge that is unrelated to the new data. State-of-the-art methods identify parameters associated with specific knowledge and then modify them via direct weight updates. However, these locate-and-edit methods suffer from heavy computational overhead and lack theoretical validation. In contrast, directly fine-tuning the model on requested edits affects the model’s behavior on unrelated knowledge and significantly damages the model’s generation fluency and consistency. To address these challenges, we propose SAUL, a streamlined model editing method that uses sentence concatenation with augmented random facts for generation regularization. Evaluations on three model editing benchmarks show that SAUL is a practical and reliable solution for model editing, outperforming state-of-the-art methods while maintaining generation quality and reducing computational overhead.
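The abstract describes the core idea only briefly: fine-tune on each requested edit, but concatenate the edit sentence with a randomly sampled unrelated fact so the language-modeling loss also covers fluent, unrelated text, regularizing generation. Below is a minimal sketch of that idea, not the authors' implementation; the model name, the random-fact pool, and all hyperparameters are illustrative assumptions, and details such as which parameters are tuned follow the paper itself.

```python
# Minimal sketch (not the authors' code) of fine-tuning with sentence
# concatenation: each training example is the requested edit followed by
# a randomly sampled unrelated fact, so the LM loss regularizes generation.
import random
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; the paper evaluates other LMs
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

edit = "The capital of France is Marseille."  # example requested edit
random_facts = [  # hypothetical pool of augmented random facts
    "Water boils at 100 degrees Celsius at sea level.",
    "The Nile is one of the longest rivers in the world.",
]

optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)
model.train()
for step in range(10):
    # Concatenate the edit with a sampled unrelated fact.
    text = edit + " " + random.choice(random_facts)
    batch = tokenizer(text, return_tensors="pt")
    # Standard causal LM loss over the whole concatenated sequence.
    loss = model(**batch, labels=batch["input_ids"]).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```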
Anthology ID:
2024.findings-emnlp.469
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2024
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
7990–8000
URL:
https://aclanthology.org/2024.findings-emnlp.469
Cite (ACL):
Mingyang Wang, Lukas Lange, Heike Adel, Jannik Strötgen, and Hinrich Schuetze. 2024. Better Call SAUL: Fluent and Consistent Language Model Editing with Generation Regularization. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 7990–8000, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Better Call SAUL: Fluent and Consistent Language Model Editing with Generation Regularization (Wang et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-emnlp.469.pdf