Improving Sequential Model Editing with Fact Retrieval

Xiaoqi Han, Ru Li, Hongye Tan, Wang Yuanlong, Qinghua Chai, Jeff Pan


Abstract
The task of sequential model editing is to fix erroneous knowledge in Pre-trained Language Models (PLMs) efficiently, precisely, and continuously. Although existing methods can handle a small number of modifications, they suffer a performance decline or require additional annotated data as the number of edits increases. In this paper, we propose a Retrieval Augmented Sequential Model Editing framework (RASE) that leverages factual information to enhance editing generalization and to guide the identification of edits by retrieving related facts from a fact-patch memory that we construct. Our main findings are: (i) state-of-the-art models can hardly correct massive mistakes stably and efficiently; (ii) even when scaled up to thousands of edits, RASE significantly enhances editing generalization and maintains consistent performance and efficiency; (iii) RASE can edit large-scale PLMs and improve the performance of different editors. Moreover, it can be integrated with ChatGPT to further improve performance. Our code and data are available at: https://github.com/sev777/RASE.
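The retrieve-then-patch routing the abstract describes can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the FactPatchMemory class, the cosine-similarity threshold, and the encoder/base_model callables are all hypothetical names introduced here for exposition.

```python
import numpy as np

# Hypothetical fact-patch memory: each entry pairs the embedding of an
# edited fact with the patch (corrected answer) recorded at edit time.
class FactPatchMemory:
    def __init__(self, dim):
        self.keys = np.empty((0, dim), dtype=np.float32)  # fact embeddings
        self.patches = []                                  # corresponding edits

    def add(self, fact_embedding, patch):
        self.keys = np.vstack([self.keys, fact_embedding[None, :]])
        self.patches.append(patch)

    def retrieve(self, query_embedding, threshold=0.8):
        if not self.patches:
            return None
        # Cosine similarity between the query and every stored fact.
        keys = self.keys / np.linalg.norm(self.keys, axis=1, keepdims=True)
        q = query_embedding / np.linalg.norm(query_embedding)
        sims = keys @ q
        best = int(np.argmax(sims))
        # Route to a patch only when the query clearly matches an edited fact.
        return self.patches[best] if sims[best] >= threshold else None


def edit_aware_answer(query, base_model, encoder, memory):
    """Answer with a retrieved patch when the query concerns an edited
    fact; otherwise fall back to the unedited base model."""
    patch = memory.retrieve(encoder(query))
    return patch if patch is not None else base_model(query)
```

Under this kind of scheme, only queries whose embeddings closely match a stored fact are answered by a patch, while everything else falls through to the unedited PLM, which is consistent with the abstract's claim of stable behavior as edits accumulate into the thousands.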
Anthology ID: 2023.findings-emnlp.749
Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 11209–11224
URL: https://aclanthology.org/2023.findings-emnlp.749
DOI: 10.18653/v1/2023.findings-emnlp.749
Cite (ACL): Xiaoqi Han, Ru Li, Hongye Tan, Wang Yuanlong, Qinghua Chai, and Jeff Pan. 2023. Improving Sequential Model Editing with Fact Retrieval. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 11209–11224, Singapore. Association for Computational Linguistics.
Cite (Informal): Improving Sequential Model Editing with Fact Retrieval (Han et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-emnlp.749.pdf