Language Models for German Text Simplification: Overcoming Parallel Data Scarcity through Style-specific Pre-training

Miriam Anschütz, Joshua Oehms, Thomas Wimmer, Bartłomiej Jezierski, Georg Groh


Abstract
Automatic text simplification systems help to reduce textual information barriers on the internet. However, for languages other than English, little parallel data exists to train these systems. We propose a two-step approach to overcome this data scarcity issue. First, we fine-tuned language models on a corpus of German Easy Language, a specific style of German. Then, we used these models as decoders in a sequence-to-sequence simplification task. We show that the language models adapt to the style characteristics of Easy Language and output more accessible texts. Moreover, the style-specific pre-training reduced the number of trainable parameters in the text simplification models, so less parallel data is sufficient for training. Our results indicate that pre-training on unaligned data can reduce the required parallel data while improving the performance on downstream tasks.
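A minimal sketch of how such a two-step setup could look with the Hugging Face transformers library (the model names, the path to the style-pre-trained decoder, and the choice of which weights to freeze are illustrative assumptions, not the authors' released code): a German encoder is paired with a causal LM decoder that was previously fine-tuned on an unaligned Easy Language corpus, and most decoder weights are frozen so that only a small number of parameters must be learned from the scarce parallel data.

```python
# Hedged sketch, not the authors' released code. Step 1 (fine-tuning a German
# causal LM on an unaligned Easy Language corpus, e.g. with
# AutoModelForCausalLM + Trainer) is assumed to have produced the checkpoint
# referenced below; step 2 plugs that LM in as the decoder of a
# sequence-to-sequence simplification model.
from transformers import EncoderDecoderModel

encoder_name = "bert-base-german-cased"             # assumed German encoder
decoder_path = "path/to/german-gpt2-easy-language"  # hypothetical style-pre-trained decoder

model = EncoderDecoderModel.from_encoder_decoder_pretrained(encoder_name, decoder_path)

# Generation settings required for encoder-decoder models in transformers;
# the decoder here is a GPT-2-style causal LM, so reuse its BOS/EOS tokens.
model.config.decoder_start_token_id = model.config.decoder.bos_token_id
model.config.pad_token_id = model.config.decoder.eos_token_id

# One way to keep the number of trainable parameters small: freeze the
# style-adapted decoder except for the newly added cross-attention layers,
# so the parallel simplification data only has to fit the remaining weights.
for name, param in model.decoder.named_parameters():
    if "crossattention" not in name and "ln_cross_attn" not in name:
        param.requires_grad = False
```

The resulting model can then be trained on complex-simple sentence pairs like any sequence-to-sequence model; because the decoder already produces Easy-Language-style text, the parallel data mainly has to teach the alignment between source and target.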
Anthology ID:
2023.findings-acl.74
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1147–1158
URL:
https://aclanthology.org/2023.findings-acl.74
DOI:
10.18653/v1/2023.findings-acl.74
Cite (ACL):
Miriam Anschütz, Joshua Oehms, Thomas Wimmer, Bartłomiej Jezierski, and Georg Groh. 2023. Language Models for German Text Simplification: Overcoming Parallel Data Scarcity through Style-specific Pre-training. In Findings of the Association for Computational Linguistics: ACL 2023, pages 1147–1158, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Language Models for German Text Simplification: Overcoming Parallel Data Scarcity through Style-specific Pre-training (Anschütz et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.74.pdf
Video:
https://aclanthology.org/2023.findings-acl.74.mp4