FragRel: Exploiting Fragment-level Relations in the External Memory of Large Language Models

Xihang Yue, Linchao Zhu, Yi Yang


Abstract
To process contexts of unlimited length with Large Language Models (LLMs), recent studies explore hierarchical management of long text: only a few text fragments are retrieved from the external memory and passed into the temporary working memory, i.e., the LLM's context window. However, existing approaches handle these fragments in isolation, without considering their structural connections, and thus perform poorly on texts with dense inter-relations, e.g., coherent stories and code repositories. This work addresses this limitation by exploiting fragment-level relations in the external memory. First, we formulate fragment-level relations and present several instantiations for different text types. Next, we introduce a relation-aware fragment assessment criterion that builds on previous independent fragment assessment. Finally, we present the fragment-connected Hierarchical Memory based LLM. We validate the benefits of involving these relations on long story understanding, repository-level code generation, and long-term chatting.
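
As a rough illustration of the relation-aware assessment idea described in the abstract, the sketch below scores each fragment by its standalone relevance to a query plus a bonus propagated from structurally related fragments. This is a minimal sketch under our own assumptions: the cosine-similarity relevance, the adjacency-matrix relation encoding, and the propagation weight `alpha` are illustrative choices, not the paper's actual formulation.

```python
# Illustrative sketch (not the paper's exact method): relation-aware fragment
# assessment. Each fragment gets an independent relevance score, then receives
# a weighted bonus from the fragments it is structurally related to.
import numpy as np

def independent_scores(query_emb: np.ndarray, frag_embs: np.ndarray) -> np.ndarray:
    """Cosine similarity between the query and each fragment embedding."""
    q = query_emb / np.linalg.norm(query_emb)
    f = frag_embs / np.linalg.norm(frag_embs, axis=1, keepdims=True)
    return f @ q

def relation_aware_scores(query_emb, frag_embs, relations, alpha=0.3):
    """Add a relation-propagated bonus on top of independent scores.

    relations: adjacency matrix R where R[i, j] = 1 if fragment j is
    related to fragment i (e.g., adjacent story chapters, or a code file
    importing another). alpha is an assumed propagation weight.
    """
    s = independent_scores(query_emb, frag_embs)
    degree = relations.sum(axis=1).clip(min=1)   # avoid division by zero
    bonus = (relations @ s) / degree             # mean score of related fragments
    return s + alpha * bonus

# Toy usage: 4 fragments related in a chain (0-1-2-3); the query is
# closest to fragment 1, so its neighbors also get boosted.
rng = np.random.default_rng(0)
frags = rng.normal(size=(4, 8))
query = frags[1] + 0.1 * rng.normal(size=8)
R = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print("fragments ranked:", np.argsort(-relation_aware_scores(query, frags, R)))
```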
Anthology ID: 2024.findings-acl.968
Volume: Findings of the Association for Computational Linguistics: ACL 2024
Month: August
Year: 2024
Address: Bangkok, Thailand
Editors: Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 16348–16361
URL: https://aclanthology.org/2024.findings-acl.968
DOI: 10.18653/v1/2024.findings-acl.968
Cite (ACL): Xihang Yue, Linchao Zhu, and Yi Yang. 2024. FragRel: Exploiting Fragment-level Relations in the External Memory of Large Language Models. In Findings of the Association for Computational Linguistics: ACL 2024, pages 16348–16361, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal): FragRel: Exploiting Fragment-level Relations in the External Memory of Large Language Models (Yue et al., Findings 2024)
PDF: https://aclanthology.org/2024.findings-acl.968.pdf