DLUE: Benchmarking Document Language Understanding

Xu Ruoxi, Lin Hongyu, Guan Xinyan, Sun Yingfei, Sun Le


Abstract
Understanding documents is central to many real-world tasks but remains a challenging topic. Unfortunately, there is no well-established consensus on how to comprehensively evaluate document understanding abilities, which significantly hinders fair comparison and measurement of the field's progress. To benchmark document understanding research, this paper summarizes four representative abilities, i.e., document classification, document structural analysis, document information extraction, and document transcription. Under the new evaluation framework, we propose Document Language Understanding Evaluation (DLUE), a new task suite which covers a wide range of tasks in various forms, domains, and document genres. We also systematically evaluate six well-established transformer models and representative LLMs on DLUE, and find that due to lengthy content, complicated underlying structure, and dispersed knowledge, document understanding is still far from solved in complex real-world scenarios.
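To make the four-ability framing concrete, the sketch below shows a minimal, hypothetical evaluation loop over a DLUE-style suite. The `Example` type, the ability names as dictionary keys, and the exact-match metric are illustrative assumptions for this sketch only; the released DLUE data format and the per-task metrics are defined in the paper itself.

```python
"""Minimal sketch of a DLUE-style evaluation harness (assumptions, not the released API)."""
from dataclasses import dataclass
from typing import Callable, Dict, List

# The four representative abilities the paper groups tasks under.
ABILITIES = [
    "document_classification",
    "document_structural_analysis",
    "document_information_extraction",
    "document_transcription",
]

@dataclass
class Example:
    document: str  # full (possibly very long) document text
    target: str    # gold label / structure / extraction / transcript

def exact_match(pred: str, gold: str) -> float:
    """Toy metric; real tasks would use accuracy, F1, ROUGE, etc."""
    return float(pred.strip() == gold.strip())

def evaluate(model: Callable[[str, str], str],
             suite: Dict[str, List[Example]]) -> Dict[str, float]:
    """Score a model on each ability and report a per-ability average."""
    scores = {}
    for ability, examples in suite.items():
        per_example = [exact_match(model(ability, ex.document), ex.target)
                       for ex in examples]
        scores[ability] = sum(per_example) / len(per_example)
    return scores

if __name__ == "__main__":
    # Trivial baseline: always predict the empty string.
    baseline = lambda ability, doc: ""
    toy_suite = {a: [Example("some long document ...", "label")]
                 for a in ABILITIES}
    print(evaluate(baseline, toy_suite))
```

Averaging per ability (rather than pooling all examples) mirrors the paper's framing, where the point is to compare models across distinct document-understanding skills rather than on a single aggregate score.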
Anthology ID:
2024.ccl-1.97
Volume:
Proceedings of the 23rd Chinese National Conference on Computational Linguistics (Volume 1: Main Conference)
Month:
July
Year:
2024
Address:
Taiyuan, China
Editors:
Maosong Sun, Jiye Liang, Xianpei Han, Zhiyuan Liu, Yulan He
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
1257–1269
Language:
English
URL:
https://aclanthology.org/2024.ccl-1.97/
Cite (ACL):
Xu Ruoxi, Lin Hongyu, Guan Xinyan, Sun Yingfei, and Sun Le. 2024. DLUE: Benchmarking Document Language Understanding. In Proceedings of the 23rd Chinese National Conference on Computational Linguistics (Volume 1: Main Conference), pages 1257–1269, Taiyuan, China. Chinese Information Processing Society of China.
Cite (Informal):
DLUE: Benchmarking Document Language Understanding (Xu et al., CCL 2024)
PDF:
https://aclanthology.org/2024.ccl-1.97.pdf