Target-Level Sentence Simplification as Controlled Paraphrasing

Tannon Kew, Sarah Ebling


Abstract
Automatic text simplification aims to reduce the linguistic complexity of a text in order to make it easier to understand and more accessible. However, simplified texts are consumed by a diverse array of target audiences, and what counts as appropriately simplified for one group of readers may differ considerably for another. In this work, we investigate a novel formulation of sentence simplification as paraphrasing with controlled decoding. This approach aims to alleviate the major burden of relying on large amounts of in-domain parallel training data, while at the same time allowing for modular and adaptive simplification. According to automatic metrics, our approach performs competitively against baselines that prove more difficult to adapt to the needs of different target audiences or that require significant amounts of aligned complex-simple parallel data.
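
To illustrate the general framing only (a hypothetical sketch, not the authors' implementation), simplification as controlled paraphrasing can be approximated by generating candidate paraphrases with an off-the-shelf paraphrase model and, at decoding time, selecting the candidate closest to a requested target reading level. The model name (tuner007/pegasus_paraphrase) and the Flesch-Kincaid reranking criterion are illustrative assumptions standing in for the paper's actual control mechanism.

# Hypothetical sketch: paraphrase a sentence, then pick the beam-search
# candidate whose estimated readability is nearest a requested target level.
# This reranking stands in for the controlled decoding described in the paper.
import textstat
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_NAME = "tuner007/pegasus_paraphrase"  # assumed off-the-shelf paraphrase model
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def simplify(sentence: str, target_grade: float, n_candidates: int = 10) -> str:
    """Paraphrase `sentence` and return the candidate whose Flesch-Kincaid
    grade level is closest to `target_grade` (a proxy for the target level)."""
    inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
    outputs = model.generate(
        **inputs,
        num_beams=n_candidates,
        num_return_sequences=n_candidates,
        max_length=64,
    )
    candidates = tokenizer.batch_decode(outputs, skip_special_tokens=True)
    # Select the paraphrase nearest the requested reading level.
    return min(
        candidates,
        key=lambda c: abs(textstat.flesch_kincaid_grade(c) - target_grade),
    )

# Example usage with an illustrative target grade level.
print(simplify("The committee deliberated extensively before reaching a verdict.", target_grade=5.0))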
Anthology ID:
2022.tsar-1.4
Volume:
Proceedings of the Workshop on Text Simplification, Accessibility, and Readability (TSAR-2022)
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates (Virtual)
Editors:
Sanja Štajner, Horacio Saggion, Daniel Ferrés, Matthew Shardlow, Kim Cheng Sheang, Kai North, Marcos Zampieri, Wei Xu
Venue:
TSAR
Publisher:
Association for Computational Linguistics
Pages:
28–42
URL:
https://aclanthology.org/2022.tsar-1.4
DOI:
10.18653/v1/2022.tsar-1.4
Bibkey:
Cite (ACL):
Tannon Kew and Sarah Ebling. 2022. Target-Level Sentence Simplification as Controlled Paraphrasing. In Proceedings of the Workshop on Text Simplification, Accessibility, and Readability (TSAR-2022), pages 28–42, Abu Dhabi, United Arab Emirates (Virtual). Association for Computational Linguistics.
Cite (Informal):
Target-Level Sentence Simplification as Controlled Paraphrasing (Kew & Ebling, TSAR 2022)
PDF:
https://aclanthology.org/2022.tsar-1.4.pdf
Video:
https://aclanthology.org/2022.tsar-1.4.mp4