Long Sequence Modeling with Attention Tensorization: From Sequence to Tensor Learning

Aosong Feng, Rex Ying, Leandros Tassiulas


Abstract
As the demand for processing extended textual data grows, the ability to handle long-range dependencies and maintain computational efficiency is more critical than ever. A key issue for long-sequence modeling with attention-based models is the mismatch between the limited-range modeling power of full attention and the long-range token dependencies in the input sequence. In this work, we propose to scale up the attention receptive field by tensorizing long input sequences into compact tensor representations, followed by attention on each transformed dimension. The resulting Tensorized Attention can be adopted as an efficient transformer backbone to extend the input context length with improved memory and time efficiency. We show that the proposed attention tensorization encodes token dependencies as a multi-hop attention process and is equivalent to a Kronecker decomposition of full attention. Extensive experiments show that tensorized attention can be used to adapt pretrained LLMs with improved efficiency. Notably, using customized Triton kernels, tensorization enables Llama-8B training at a 32,768 context length and steady extrapolation to 128k length during inference with an 11× speedup (compared to full attention with FlashAttention-2).
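
To make the tensorization idea concrete, the sketch below folds a length-N sequence into an (n1, n2) grid and applies attention along each folded axis in turn, mirroring the multi-hop view described in the abstract. The function name, shapes, and two-level factorization are illustrative assumptions, not the authors' implementation or their Triton kernels.

```python
# Minimal sketch of attention tensorization (illustrative assumptions only,
# not the paper's actual kernels or hyperparameters).
import torch
import torch.nn.functional as F

def tensorized_attention(q, k, v, dims=(64, 512)):
    """Fold a length-N sequence into an (n1, n2) grid and attend along each axis.

    q, k, v: (batch, heads, N, head_dim) with N == dims[0] * dims[1].
    Attending along each folded dimension in turn acts as a multi-hop
    approximation of the full N x N attention, loosely analogous to a
    Kronecker factorization of the attention matrix.
    """
    b, h, n, d = q.shape
    n1, n2 = dims
    assert n == n1 * n2, "sequence length must factor into the tensor dims"

    def fold(x):  # (b, h, N, d) -> (b, h, n1, n2, d)
        return x.view(b, h, n1, n2, d)

    q, k, v = fold(q), fold(k), fold(v)

    # Hop 1: attention along the inner (local) axis of length n2;
    # the n1 axis is treated as an extra batch dimension.
    out = F.scaled_dot_product_attention(q, k, v)

    # Hop 2: attention along the outer (strided) axis of length n1;
    # the n2 axis is treated as an extra batch dimension.
    qt, kt, vt = (t.transpose(2, 3) for t in (q, k, out))
    out = F.scaled_dot_product_attention(qt, kt, vt)

    return out.transpose(2, 3).reshape(b, h, n, d)

# Example: a 32,768-token sequence factored as 64 x 512.
q = k = v = torch.randn(1, 8, 64 * 512, 64)
out = tensorized_attention(q, k, v, dims=(64, 512))  # -> (1, 8, 32768, 64)
```

Each hop only materializes attention over one folded axis (n1 or n2 keys per query instead of N), which is where the memory and time savings over full attention come from in this simplified view.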
Anthology ID:
2024.findings-emnlp.858
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2024
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
14642–14655
URL:
https://aclanthology.org/2024.findings-emnlp.858
Cite (ACL):
Aosong Feng, Rex Ying, and Leandros Tassiulas. 2024. Long Sequence Modeling with Attention Tensorization: From Sequence to Tensor Learning. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 14642–14655, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Long Sequence Modeling with Attention Tensorization: From Sequence to Tensor Learning (Feng et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-emnlp.858.pdf