Kanbun-LM: Reading and Translating Classical Chinese in Japanese Methods by Language Models

Hao Wang, Hirofumi Shimizu, Daisuke Kawahara


Abstract
Recent studies in natural language processing (NLP) have focused on modern languages and achieved state-of-the-art results in many tasks. Meanwhile, little attention has been paid to ancient texts and related tasks. Classical Chinese first came to Japan approximately 2,000 years ago. It was gradually adapted into a Japanese reading and translating method called Kanbun-Kundoku (Kanbun), which has significantly influenced Japanese literature. However, compared to the rich resources of ancient texts in mainland China, Kanbun resources remain scarce in Japan. To solve this problem, we construct the first Classical-Chinese-to-Kanbun dataset in the world. Furthermore, we introduce two tasks, character reordering and machine translation, both of which play a significant role in Kanbun comprehension. We also test current language models on these tasks and discuss the best evaluation method by comparing the results with human scores. We release our code and dataset on GitHub.
Anthology ID:
2023.findings-acl.545
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
8589–8601
URL:
https://aclanthology.org/2023.findings-acl.545
DOI:
10.18653/v1/2023.findings-acl.545
Cite (ACL):
Hao Wang, Hirofumi Shimizu, and Daisuke Kawahara. 2023. Kanbun-LM: Reading and Translating Classical Chinese in Japanese Methods by Language Models. In Findings of the Association for Computational Linguistics: ACL 2023, pages 8589–8601, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Kanbun-LM: Reading and Translating Classical Chinese in Japanese Methods by Language Models (Wang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.545.pdf
Video:
https://aclanthology.org/2023.findings-acl.545.mp4