Controlling Pre-trained Language Models for Grade-Specific Text Simplification

Sweta Agrawal, Marine Carpuat

Abstract
Text simplification systems rewrite text to make it more readable while preserving its content. However, what makes a text easy to read depends on the intended readers. Recent work has shown that pre-trained language models can simplify text using a wealth of techniques to control output simplicity, ranging from specifying only the desired reading grade level, to directly specifying low-level edit operations. Yet it remains unclear how to set these control parameters in practice. Existing approaches set them at the corpus level, disregarding the complexity of individual inputs and considering only one level of output complexity. In this work, we conduct an empirical study to understand how different control mechanisms impact the adequacy and simplicity of text simplification systems. Based on these insights, we introduce a simple method that predicts the edit operations required for simplifying a text for a specific grade level on an instance-per-instance basis. This approach improves the quality of the simplified outputs over corpus-level search-based heuristics.
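
As a rough illustration of the control-token idea the abstract describes (this is not the authors' released code), the sketch below formats a control-prefixed input for one source sentence and one target grade. The ACCESS-style token names (NbChars, LevSim), the crude Flesch-Kincaid grade estimate, and the linear mapping from the grade gap to control values are all stand-in assumptions for the instance-level predictor in the paper.

import re

VOWELS = "aeiouy"

def count_syllables(word: str) -> int:
    """Crude syllable count: runs of vowels, minimum one."""
    runs = re.findall(rf"[{VOWELS}]+", word.lower())
    return max(1, len(runs))

def fk_grade(text: str) -> float:
    """Flesch-Kincaid grade level, treating the text as one sentence."""
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    # FK grade = 0.39 * words/sentence + 11.8 * syllables/word - 15.59
    return 0.39 * len(words) + 11.8 * (syllables / len(words)) - 15.59

def predict_controls(src: str, tgt_grade: int) -> dict:
    """Map the (source grade, target grade) gap to edit-control values.

    The linear rule is a placeholder for the learned instance-level
    predictor described in the paper; the coefficients are made up.
    """
    gap = max(0.0, fk_grade(src) - tgt_grade)
    return {
        "NbChars": round(max(0.4, 1.0 - 0.05 * gap), 2),  # length ratio
        "LevSim": round(max(0.3, 1.0 - 0.04 * gap), 2),   # similarity to source
    }

def control_prefixed_input(src: str, tgt_grade: int) -> str:
    """Prepend control tokens, as consumed by a suitably fine-tuned LM."""
    controls = predict_controls(src, tgt_grade)
    prefix = " ".join(f"<{k}_{v}>" for k, v in controls.items())
    return f"{prefix} {src}"

if __name__ == "__main__":
    sent = "The committee postponed the ratification of the amendment."
    print(control_prefixed_input(sent, tgt_grade=4))

The point of setting controls per instance rather than per corpus, as the abstract argues, is visible even in this toy version: a dense source sentence targeted at grade 4 receives a tighter length and similarity budget than an already-simple sentence would, instead of every input sharing one corpus-level setting.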
Anthology ID:
2023.emnlp-main.790
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
12807–12819
URL:
https://aclanthology.org/2023.emnlp-main.790
DOI:
10.18653/v1/2023.emnlp-main.790
Cite (ACL):
Sweta Agrawal and Marine Carpuat. 2023. Controlling Pre-trained Language Models for Grade-Specific Text Simplification. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 12807–12819, Singapore. Association for Computational Linguistics.
Cite (Informal):
Controlling Pre-trained Language Models for Grade-Specific Text Simplification (Agrawal & Carpuat, EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.790.pdf
Video:
https://aclanthology.org/2023.emnlp-main.790.mp4