InstructEd: Soft-Instruction Tuning for Model Editing with Hops

XiaoQi Han, Ru Li, Xiaoli Li, Jiye Liang, Zifang Zhang, Jeff Pan


Abstract
The task of model editing has become popular for correcting inaccurate or outdated parametric knowledge in Large Language Models (LLMs). However, state-of-the-art (SOTA) model editing methods have major limitations, including the excessive memorization issue caused by direct editing methods, as well as the error propagation and knowledge conflict issues of memory-enhancement methods, all of which hinder models’ *portability*, i.e., the ability to transfer new knowledge to related one-hop or multi-hop content. To address these issues, we propose the InstructEd method, the idea of which is to insert soft instructions into the attention module so as to facilitate interactions between instructions and questions and to understand and utilize new facts. Our main findings are: (i) InstructEd achieves SOTA performance on three datasets for one-hop/multi-hop evaluation with LLaMAs and GPT2, with a 10% (5%) improvement in one-hop (multi-hop) model editing. (ii) Unlike earlier methods that edit parameters in the FFN, we show that editing attention can also help. (iii) Model editing is closely related to retrieval-augmented methods, which can improve the locality of model editing while slightly decreasing editing performance with hops.
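The core mechanism the abstract describes, inserting soft instructions into the attention module, can be sketched in the style of prefix tuning: learnable instruction vectors are prepended to the keys and values so that every query token can attend to them. This is a minimal illustrative sketch, not the paper's implementation; all class names, dimensions, and hyperparameters below are assumptions.

```python
import torch
import torch.nn as nn

class SoftInstructionAttention(nn.Module):
    """Attention layer with learnable soft-instruction vectors prepended
    to the keys and values (prefix-tuning-style sketch; names and sizes
    are illustrative assumptions, not taken from the paper)."""

    def __init__(self, d_model=64, n_heads=4, n_instr=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Learnable soft instructions injected into the attention module.
        self.instr_k = nn.Parameter(torch.randn(n_instr, d_model) * 0.02)
        self.instr_v = nn.Parameter(torch.randn(n_instr, d_model) * 0.02)

    def forward(self, x):
        b = x.size(0)
        # Prepend the instruction vectors to keys and values for each batch item.
        k = torch.cat([self.instr_k.expand(b, -1, -1), x], dim=1)
        v = torch.cat([self.instr_v.expand(b, -1, -1), x], dim=1)
        # Queries attend jointly to the soft instructions and the input tokens.
        out, _ = self.attn(x, k, v)
        return out

layer = SoftInstructionAttention()
x = torch.randn(2, 10, 64)   # (batch, seq_len, d_model)
y = layer(x)
print(y.shape)               # torch.Size([2, 10, 64])
```

Only the `instr_k`/`instr_v` parameters (and possibly the attention projections) would need tuning for an edit, which is what makes this family of approaches lightweight compared with rewriting FFN weights directly.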
Anthology ID:
2024.findings-acl.888
Volume:
Findings of the Association for Computational Linguistics: ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
14953–14968
URL:
https://aclanthology.org/2024.findings-acl.888
DOI:
10.18653/v1/2024.findings-acl.888
Cite (ACL):
XiaoQi Han, Ru Li, Xiaoli Li, Jiye Liang, Zifang Zhang, and Jeff Pan. 2024. InstructEd: Soft-Instruction Tuning for Model Editing with Hops. In Findings of the Association for Computational Linguistics: ACL 2024, pages 14953–14968, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
InstructEd: Soft-Instruction Tuning for Model Editing with Hops (Han et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.888.pdf