SEGMENT+: Long Text Processing with Short-Context Language Models
Wei Shi | Shuang Li | Kerun Yu | Jinglei Chen | Zujie Liang | Xinhui Wu | Yuxi Qian | Feng Wei | Bo Zheng | Jiaqing Liang | Jiangjie Chen | Yanghua Xiao
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
There is a growing interest in expanding the input capacity of language models (LMs) across various domains. However, simply increasing the context window does not guarantee robust performance across diverse long-input processing tasks, such as understanding extensive documents and extracting detailed information from lengthy and noisy data. In response, we introduce Segment+, a general framework that enables LMs to handle extended inputs within limited context windows efficiently. Segment+ utilizes structured notes and a filtering module to manage information flow, resulting in a system that is both controllable and interpretable. Our extensive experiments across various model sizes, focusing on long-document question-answering and Needle-in-a-Haystack tasks, demonstrate the effectiveness of Segment+ in improving performance.
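The abstract describes a segment-then-filter pipeline: split the long input into window-sized segments, draft a structured note per segment, filter the notes for relevance, and answer from what survives. A minimal sketch of that flow is below; `summarize`, `is_relevant`, and `answer` are hypothetical placeholders for LM calls, not the authors' implementation, and the character-based splitter is an assumed simplification.

```python
# Sketch of a Segment+-style pipeline: segment a long document so each piece
# fits a short context window, take a structured note per segment, filter the
# notes against the question, then answer from the filtered notes only.
# All three "LM" functions below are rule-based stand-ins for illustration.

def segment(text: str, window: int) -> list[str]:
    """Greedily split text into chunks of at most `window` characters."""
    return [text[i:i + window] for i in range(0, len(text), window)]

def summarize(chunk: str) -> dict:
    """Placeholder for an LM call that produces a structured note."""
    return {"note": chunk.strip()}

def is_relevant(note: dict, question: str) -> bool:
    """Placeholder filter: keep notes sharing a content word with the question."""
    q_words = set(question.lower().split())
    return bool(q_words & set(note["note"].lower().split()))

def answer(notes: list[dict], question: str) -> str:
    """Placeholder for a final LM call over the filtered notes."""
    return " / ".join(n["note"] for n in notes) or "no relevant evidence"

def segment_plus(text: str, question: str, window: int = 64) -> str:
    notes = [summarize(c) for c in segment(text, window)]
    kept = [n for n in notes if is_relevant(n, question)]
    return answer(kept, question)
```

The filtering step is what keeps the pipeline controllable and interpretable: the notes passed to the final call can be inspected directly, and irrelevant segments never reach the answering model.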