Promoting Pre-trained LM with Linguistic Features on Automatic Readability Assessment

Shudi Hou, Simin Rao, Yu Xia, Sujian Li


Abstract
Automatic readability assessment (ARA) aims to classify the readability level of a passage automatically. In the past, manually selected linguistic features were used to classify passages. However, as the use of deep neural networks has surged, less work has focused on these linguistic features. Recently, many works integrate linguistic features with pre-trained language models (PLMs) to make up for the information that PLMs are not good at capturing. Despite their initial success, little analysis has been done of the long-passage characteristic of ARA. To further investigate how linguistic features promote PLMs in ARA from the perspective of passage length, using commonly adopted linguistic features and extensive experiments, we find that: (1) linguistic features promote PLMs in ARA mainly on long passages; (2) the promotion from the features becomes less significant when the dataset size exceeds 750 passages; (3) by analyzing commonly used ARA datasets, we find that Newsela is actually not suitable for ARA. Our code is available at https://github.com/recorderhou/linguistic-features-in-ARA.
Anthology ID:
2022.aacl-short.54
Volume:
Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Month:
November
Year:
2022
Address:
Online only
Editors:
Yulan He, Heng Ji, Sujian Li, Yang Liu, Chia-Hui Chang
Venues:
AACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
430–436
URL:
https://aclanthology.org/2022.aacl-short.54
Cite (ACL):
Shudi Hou, Simin Rao, Yu Xia, and Sujian Li. 2022. Promoting Pre-trained LM with Linguistic Features on Automatic Readability Assessment. In Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 430–436, Online only. Association for Computational Linguistics.
Cite (Informal):
Promoting Pre-trained LM with Linguistic Features on Automatic Readability Assessment (Hou et al., AACL-IJCNLP 2022)
PDF:
https://aclanthology.org/2022.aacl-short.54.pdf
Software:
2022.aacl-short.54.Software.rar