Yantuan Xian
2024
Does Large Language Model Contain Task-Specific Neurons?
Ran Song | Shizhu He | Shuting Jiang | Yantuan Xian | Shengxiang Gao | Kang Liu | Zhengtao Yu
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Large language models (LLMs) have demonstrated remarkable capabilities in handling a wide variety of natural language processing (NLP) tasks. However, different tasks require significantly different knowledge and abilities, so it is important to understand whether the same LLM processes different tasks in the same way. Are there task-specific neurons in an LLM? Inspired by neuroscience, this paper pioneers the exploration of whether distinct neurons are activated when an LLM handles different tasks. Compared with current research on language and knowledge neurons, task-specific neurons pose a greater challenge due to their abstractness, diversity, and complexity. To address these challenges, this paper proposes a method for task-specific neuron localization based on Causal Gradient Variation with Special Tokens (CGVST). CGVST identifies task-specific neurons by concentrating on the most significant tokens during task processing, thereby eliminating redundant tokens and minimizing interference from non-essential neurons. Compared to traditional neuron localization methods, our approach identifies task-specific neurons more effectively. We conduct experiments across eight public tasks; inhibiting and amplifying the identified neurons demonstrates that our method accurately locates task-specific neurons.
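The abstract describes gradient-based localization restricted to the most significant tokens. A minimal sketch of that general idea, under assumptions not detailed in the abstract (the exact CGVST scoring rule is not given here): score each neuron by the magnitude of gradient-times-activation, aggregated only over a mask of salient tokens, then keep the top-k neurons. All function names and the toy data below are illustrative.

```python
import numpy as np

def neuron_importance(activations, gradients, token_mask):
    """Score each neuron by |gradient x activation|, aggregated only over
    the tokens selected by token_mask (the most significant tokens).

    activations, gradients: (num_tokens, num_neurons) arrays
    token_mask: boolean (num_tokens,) array selecting salient tokens
    """
    contrib = np.abs(activations * gradients)  # per-token, per-neuron attribution
    return contrib[token_mask].mean(axis=0)    # aggregate over kept tokens only

def top_k_neurons(scores, k):
    """Indices of the k highest-scoring neurons."""
    return np.argsort(scores)[::-1][:k]

# Toy example: 5 tokens, 4 neurons; only tokens 1 and 3 count as significant.
rng = np.random.default_rng(0)
acts = rng.normal(size=(5, 4))
grads = rng.normal(size=(5, 4))
mask = np.array([False, True, False, True, False])
scores = neuron_importance(acts, grads, mask)
print(top_k_neurons(scores, 2))
```

Restricting the aggregation to the masked tokens is what keeps redundant tokens from diluting the per-neuron scores, which is the intuition the abstract attributes to CGVST.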
2012
Chinese Name Disambiguation Based on Adaptive Clustering with the Attribute Features
Wei Tian | Xiao Pan | Zhengtao Yu | Yantuan Xian | Xiuzhen Yang
Proceedings of the Second CIPS-SIGHAN Joint Conference on Chinese Language Processing
Co-authors
- Zhengtao Yu 2
- Ran Song 1
- Shizhu He 1
- Shuting Jiang 1
- Shengxiang Gao 1