MWE for Essay Scoring English as a Foreign Language

Rodrigo Wilkens, Daiane Seibert, Xiaoou Wang, Thomas François


Abstract
Mastering a foreign language like English can bring better opportunities. Although multiword expressions (MWEs) are associated with proficiency, they are usually neglected in work on automatically scoring language learners. In this work, we therefore study MWE-based features (i.e., occurrence and concreteness), aiming to assess their relevance for automated essay scoring. To this end, we compare the MWE features with classic features, such as length-based, graded-resource, orthographic-neighbor, part-of-speech, morphology, dependency-relation, verb-tense, language-development, and coherence features. Although the results indicate that the classic features are more significant than the MWE features for automatic scoring, we observed encouraging results when examining MWE concreteness across proficiency levels.
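As an illustrative sketch only (not the authors' implementation), an MWE occurrence feature of the kind mentioned in the abstract could be computed roughly as below; the MWE lexicon, the tokenisation, and the normalisation by essay length are assumptions for the example.

```python
# Sketch: count MWE occurrences in a tokenised essay against a hypothetical
# MWE lexicon, and normalise by essay length. Not the paper's actual method.
from typing import Iterable


def mwe_occurrence_features(tokens: list[str],
                            mwe_lexicon: Iterable[tuple[str, ...]]) -> dict[str, float]:
    """Return raw and length-normalised counts of lexicon MWEs in the essay."""
    text = " ".join(t.lower() for t in tokens)
    count = 0
    for mwe in mwe_lexicon:
        phrase = " ".join(mwe).lower()
        count += text.count(phrase)  # simple contiguous matching
    return {
        "mwe_count": float(count),
        "mwe_per_token": count / len(tokens) if tokens else 0.0,
    }


# Toy usage with purely hypothetical lexicon entries:
lexicon = [("by", "and", "large"), ("take", "into", "account")]
essay = "We must take into account that , by and large , learners improve .".split()
print(mwe_occurrence_features(essay, lexicon))
```

A concreteness-based variant would additionally look up each matched MWE in a concreteness-rated resource and aggregate the ratings, but that resource is not specified here.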
Anthology ID:
2022.readi-1.9
Volume:
Proceedings of the 2nd Workshop on Tools and Resources to Empower People with REAding DIfficulties (READI) within the 13th Language Resources and Evaluation Conference
Month:
June
Year:
2022
Address:
Marseille, France
Editors:
Rodrigo Wilkens, David Alfter, Rémi Cardon, Núria Gala
Venue:
READI
Publisher:
European Language Resources Association
Pages:
62–69
URL:
https://aclanthology.org/2022.readi-1.9
Cite (ACL):
Rodrigo Wilkens, Daiane Seibert, Xiaoou Wang, and Thomas François. 2022. MWE for Essay Scoring English as a Foreign Language. In Proceedings of the 2nd Workshop on Tools and Resources to Empower People with REAding DIfficulties (READI) within the 13th Language Resources and Evaluation Conference, pages 62–69, Marseille, France. European Language Resources Association.
Cite (Informal):
MWE for Essay Scoring English as a Foreign Language (Wilkens et al., READI 2022)
PDF:
https://aclanthology.org/2022.readi-1.9.pdf