CSCD-NS: a Chinese Spelling Check Dataset for Native Speakers

Yong Hu, Fandong Meng, Jie Zhou

Abstract
In this paper, we present CSCD-NS, the first Chinese spelling check (CSC) dataset designed for native speakers, containing 40,000 samples from a Chinese social platform. Compared with existing CSC datasets aimed at Chinese learners, CSCD-NS is ten times larger in scale and exhibits a distinct error distribution, with a significantly higher proportion of word-level errors. To further enhance the data resource, we propose a novel method that simulates the input process through an input method, generating large-scale and high-quality pseudo data that closely resembles the actual error distribution and outperforms existing methods. Moreover, we investigate the performance of various models in this scenario, including large language models (LLMs), such as ChatGPT. The results indicate that generative models underperform BERT-like classification models due to strict length and pronunciation constraints. The high prevalence of word-level errors also makes CSC for native speakers particularly challenging, leaving substantial room for improvement.
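The abstract's pseudo-data idea rests on a simple observation: pinyin input methods tend to produce errors that preserve pronunciation while changing the written form. The sketch below is a minimal, character-level homophone-substitution illustration of that idea only; it is not the authors' released pipeline, which simulates the full input-method process and targets word-level errors. The candidate vocabulary, error rate, and function names here are illustrative assumptions, and the `pypinyin` package is used for toneless pinyin lookup.

```python
# Hedged sketch of pinyin-based pseudo-error generation (NOT the paper's actual pipeline).
# Idea: replace characters with same-pronunciation alternatives, keeping length and pinyin fixed.
# Requires: pip install pypinyin
import random
from collections import defaultdict

from pypinyin import lazy_pinyin, Style


def build_homophone_map(vocab):
    """Group candidate characters by their toneless pinyin."""
    groups = defaultdict(list)
    for ch in vocab:
        key = "".join(lazy_pinyin(ch, style=Style.NORMAL))
        groups[key].append(ch)
    return groups


def corrupt(sentence, homophones, error_rate=0.1, seed=None):
    """Randomly swap some characters for homophones to create a noisy source sentence."""
    rng = random.Random(seed)
    chars = list(sentence)
    for i, ch in enumerate(chars):
        key = "".join(lazy_pinyin(ch, style=Style.NORMAL))
        candidates = [c for c in homophones.get(key, []) if c != ch]
        if candidates and rng.random() < error_rate:
            chars[i] = rng.choice(candidates)
    return "".join(chars)


if __name__ == "__main__":
    vocab = "他她它在再做作的得地是事市试"  # tiny illustrative candidate set (assumption)
    homophones = build_homophone_map(vocab)
    clean = "他在做作业"
    noisy = corrupt(clean, homophones, error_rate=0.5, seed=0)
    print(clean, "->", noisy)  # noisy side keeps the same length and pronunciation
```

A realistic generator in the spirit of the paper would operate on words rather than single characters and would model how an input method ranks and selects candidates, which is what lets the generated errors match the word-level-heavy distribution observed in CSCD-NS.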
Anthology ID:
2024.acl-long.10
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
146–159
URL:
https://aclanthology.org/2024.acl-long.10
DOI:
10.18653/v1/2024.acl-long.10
Cite (ACL):
Yong Hu, Fandong Meng, and Jie Zhou. 2024. CSCD-NS: a Chinese Spelling Check Dataset for Native Speakers. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 146–159, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
CSCD-NS: a Chinese Spelling Check Dataset for Native Speakers (Hu et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-long.10.pdf