Kaijie Mo


2024

ExpertEase: A Multi-Agent Framework for Grade-Specific Document Simplification with Large Language Models
Kaijie Mo | Renfen Hu
Findings of the Association for Computational Linguistics: EMNLP 2024

Text simplification is crucial for making texts more accessible, yet current research focuses primarily on sentence-level simplification, neglecting document-level simplification and the differing reading levels of target audiences. To bridge these gaps, we introduce ExpertEase, a multi-agent framework for grade-specific document simplification using Large Language Models (LLMs). ExpertEase simulates real-world text simplification by introducing expert, teacher, and student agents that cooperate on the task and rely on external tools for calibration. Experiments demonstrate that this multi-agent approach significantly enhances LLMs' ability to simplify reading materials for diverse audiences. Furthermore, we evaluate LLMs of varying sizes and types and compare LLM-generated texts with human-authored ones, highlighting the potential of LLMs in educational resource development and guiding future research.