ARXSA: A General Negative Feedback Control Theory in Vision-Language Models

Zeyu Zhang, Tianqi Chen, Yuki Todo


Abstract
The Transformer model has been applied increasingly widely across domains, driven largely by the self-attention mechanism, whose robust data-processing capability has contributed substantially to the model's success. In the self-attention mechanism, three core matrices derived from the same data batch are combined to determine correlations between input elements. Drawing inspiration from the efficiency and stability that negative feedback structures confer on predictive control systems, the concept of vertical training is introduced to integrate information across multiple batches. Accordingly, this paper proposes an autoregressive with exogenous inputs (ARX) formulation of the self-attention mechanism, transforming the Encoder block into a negative feedback predictive control system. A network architecture built on this method is also proposed, enabling the ARX-based self-attention to propagate data from batches at previous time steps. The effectiveness of the proposed approach is validated through comparative experiments.
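As a rough illustration of the idea described in the abstract (the paper's exact formulation is in the PDF), the sketch below treats the output of a standard scaled dot-product self-attention as the exogenous input u(t) of a classical ARX model, y(t) = Σ_i a_i·y(t−i) + Σ_j b_j·u(t−j) + e(t), and feeds back outputs buffered from earlier batches as the autoregressive terms. The class name, lag count, learnable coefficients, mean-pooling of the history, and detaching across batches are all assumptions made for this illustration, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ARXSelfAttention(nn.Module):
    """Illustrative sketch: self-attention whose output is mixed, ARX-style,
    with outputs buffered from previous batches (not the authors' exact model)."""

    def __init__(self, d_model: int, n_lags: int = 2):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # ARX coefficients: a_i weight the buffered past outputs y(t-i);
        # b0 weights the current attention output, treated as the exogenous
        # input u(t). Both are learnable here (an assumption).
        self.a = nn.Parameter(torch.zeros(n_lags))
        self.b0 = nn.Parameter(torch.ones(1))
        # Pooled outputs from the n_lags most recent batches.
        self.register_buffer("history", torch.zeros(n_lags, 1, 1, d_model))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        # Standard scaled dot-product self-attention on the current batch.
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = q @ k.transpose(-2, -1) / (q.shape[-1] ** 0.5)
        u = F.softmax(scores, dim=-1) @ v  # exogenous input u(t)

        # ARX combination: current attention output plus a weighted sum of
        # outputs retained from previous batches (broadcast over batch/seq).
        y = self.b0 * u + (self.a.view(-1, 1, 1, 1) * self.history).sum(dim=0)

        # Update the history with the new output, mean-pooled over batch and
        # sequence and detached so gradients do not flow across batches
        # (both choices are assumptions of this sketch).
        pooled = y.detach().mean(dim=(0, 1), keepdim=True)  # (1, 1, d_model)
        self.history = torch.cat([pooled.unsqueeze(0), self.history[:-1]], dim=0)
        return y
```

Calling the module on successive batches then reuses the buffered outputs as the feedback path of the block, which is the role the negative feedback structure plays in the proposed Encoder design.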
Anthology ID:
2025.findings-emnlp.591
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rosé, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
11100–11110
URL:
https://aclanthology.org/2025.findings-emnlp.591/
Cite (ACL):
Zeyu Zhang, Tianqi Chen, and Yuki Todo. 2025. ARXSA: A General Negative Feedback Control Theory in Vision-Language Models. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 11100–11110, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
ARXSA: A General Negative Feedback Control Theory in Vision-Language Models (Zhang et al., Findings 2025)
PDF:
https://aclanthology.org/2025.findings-emnlp.591.pdf
Checklist:
 2025.findings-emnlp.591.checklist.pdf