Zero-shot Cross-Lingual Transfer for Synthetic Data Generation in Grammatical Error Detection

Gaetan Latouche, Marc-André Carbonneau, Benjamin Swanson


Abstract
Grammatical Error Detection (GED) methods rely heavily on human-annotated error corpora. However, these annotations are unavailable in many low-resource languages. In this paper, we investigate GED in this context. Leveraging the zero-shot cross-lingual transfer capabilities of multilingual pre-trained language models, we train a model using data from a diverse set of languages to generate synthetic errors in other languages. These synthetic error corpora are then used to train a GED model. Specifically, we propose a two-stage fine-tuning pipeline in which the GED model is first fine-tuned on multilingual synthetic data from target languages, followed by fine-tuning on human-annotated GED corpora from source languages. This approach outperforms current state-of-the-art annotation-free GED methods. We also analyse the errors produced by our method and other strong baselines, finding that our approach produces errors that are more diverse and more similar to human errors.
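The two-stage fine-tuning pipeline described in the abstract can be sketched as follows. This is a minimal illustrative sketch only: the function and corpus names are assumptions, not the authors' code, and the training step is a placeholder for actual token-level GED fine-tuning.

```python
# Hypothetical sketch of the two-stage fine-tuning pipeline from the abstract.
# Stage 1: fine-tune a GED model on multilingual *synthetic* error corpora
#          generated zero-shot in the target languages.
# Stage 2: continue fine-tuning on *human-annotated* GED corpora from the
#          source languages.
# All names below are illustrative placeholders.

def fine_tune(model, corpus, stage):
    # Placeholder: a real implementation would run token-level
    # error-detection training (e.g. binary error tagging) here.
    model["history"].append((stage, corpus["name"]))
    return model

def two_stage_pipeline(model, synthetic_target_corpus, human_source_corpus):
    # Stage order matters: synthetic target-language data first,
    # human-annotated source-language data second.
    model = fine_tune(model, synthetic_target_corpus, stage=1)
    model = fine_tune(model, human_source_corpus, stage=2)
    return model

model = {"history": []}
synthetic = {"name": "synthetic-target-langs"}
human = {"name": "human-annotated-source-langs"}
trained = two_stage_pipeline(model, synthetic, human)
```

The key design choice the sketch captures is ordering: synthetic data adapts the model to the target languages before the final pass on human-annotated source-language data.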
Anthology ID:
2024.emnlp-main.176
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3002–3016
URL:
https://aclanthology.org/2024.emnlp-main.176
Cite (ACL):
Gaetan Latouche, Marc-André Carbonneau, and Benjamin Swanson. 2024. Zero-shot Cross-Lingual Transfer for Synthetic Data Generation in Grammatical Error Detection. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 3002–3016, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Zero-shot Cross-Lingual Transfer for Synthetic Data Generation in Grammatical Error Detection (Latouche et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.176.pdf