Data-scarce Behavior Editing of Language Models

Joykirat Singh, Subhabrata Dutta, Tanmoy Chakraborty


Abstract
Large Language Models trained on web-scale text acquire language generation abilities that can solve a wide range of tasks, particularly when task knowledge is refined into the generative prior using in-context examples. However, spurious features learned from noisy data hinder their generalizability. Supervised fine-tuning can enhance task specificity but may be data-inefficient. Prior studies indicate that (i) noisy neural circuitries coexist with generalizable ones within LLMs, and (ii) fine-tuning typically enhances (or suppresses) existing abilities without introducing new ones. Building upon these observations, we propose TaRot, a novel method for task adaptation. TaRot intervenes in the neural circuitries using learnable rotation matrices that are optimized via Bayesian optimization on a number of labelled samples on the order of standard few-shot prompting. Experiments on multiple classification and generation tasks using LLMs of varying sizes demonstrate the efficacy of TaRot, improving upon both zero- and few-shot performance, with average improvements (across models and tasks) of 15.6% and 14%, respectively.
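The abstract only sketches the mechanism, so the following is a minimal, hypothetical illustration of the general idea rather than the authors' implementation: a rotation parameterized by a few angles is applied to an intermediate representation, and the angles are tuned with Bayesian optimization on a handful of labelled examples. The toy task, the Givens-rotation parameterization, and the use of scikit-optimize's gp_minimize are all assumptions made for illustration.

```python
# Illustrative sketch only (not the paper's code): edit a frozen representation
# with a learnable rotation, tuning the rotation angles via Bayesian optimization
# on a few labelled samples. Toy data and scikit-optimize are assumptions.
import numpy as np
from skopt import gp_minimize

rng = np.random.default_rng(0)
d = 8                 # toy hidden dimension (stand-in for a head's output dim)
n_few_shot = 16       # labelled samples, on the order of few-shot prompting

# Toy setup: a frozen "readout" direction determines the label, but the frozen
# representation H is misaligned with it; a rotation can improve the alignment.
readout = rng.normal(size=d)
X = rng.normal(size=(n_few_shot, d))
y = (X @ readout > 0).astype(int)
misalign = np.linalg.qr(rng.normal(size=(d, d)))[0]   # fixed spurious transform
H = X @ misalign.T                                    # "noisy" outputs we may edit

def givens_rotation(angles, dim):
    """Compose planar (Givens) rotations on consecutive coordinate pairs."""
    R = np.eye(dim)
    for k, theta in enumerate(angles):
        i, j = 2 * k, 2 * k + 1
        G = np.eye(dim)
        G[i, i] = G[j, j] = np.cos(theta)
        G[i, j], G[j, i] = -np.sin(theta), np.sin(theta)
        R = G @ R
    return R

def neg_accuracy(angles):
    """Objective for Bayesian optimization: error of the frozen readout after rotating."""
    R = givens_rotation(angles, d)
    preds = ((H @ R.T) @ readout > 0).astype(int)
    return 1.0 - float((preds == y).mean())

# Bayesian optimization over the rotation angles, using only the few labelled samples.
result = gp_minimize(neg_accuracy,
                     dimensions=[(-np.pi, np.pi)] * (d // 2),
                     n_calls=40, random_state=0)
print("few-shot accuracy after rotation edit:", 1.0 - result.fun)
```

In the paper's setting, the rotated object would presumably be an internal LLM representation (e.g., the output of selected attention heads) rather than a toy feature vector, and the objective would be task performance on the labelled examples.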
Anthology ID:
2025.findings-emnlp.514
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
9687–9701
URL:
https://aclanthology.org/2025.findings-emnlp.514/
Cite (ACL):
Joykirat Singh, Subhabrata Dutta, and Tanmoy Chakraborty. 2025. Data-scarce Behavior Editing of Language Models. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 9687–9701, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Data-scarce Behavior Editing of Language Models (Singh et al., Findings 2025)
PDF:
https://aclanthology.org/2025.findings-emnlp.514.pdf
Checklist:
 2025.findings-emnlp.514.checklist.pdf