Guan Xinyan
2024
DLUE: Benchmarking Document Language Understanding
Xu Ruoxi | Lin Hongyu | Guan Xinyan | Sun Yingfei | Sun Le
Proceedings of the 23rd Chinese National Conference on Computational Linguistics (Volume 1: Main Conference)
“Understanding documents is central to many real-world tasks but remains a challenging topic. Unfortunately, there is no well-established consensus on how to comprehensively evaluate document understanding abilities, which significantly hinders fair comparison and measuring the progress of the field. To benchmark document understanding research, this paper summarizes four representative abilities, i.e., document classification, document structural analysis, document information extraction, and document transcription. Under the new evaluation framework, we propose Document Language Understanding Evaluation – DLUE, a new task suite which covers a wide range of tasks in various forms, domains, and document genres. We also systematically evaluate six well-established transformer models and representative LLMs on DLUE, and find that due to the lengthy content, complicated underlying structure, and dispersed knowledge, document understanding is still far from being solved in complex real-world scenarios.”