How do different tokenizers perform on downstream tasks in scriptio continua languages?: A case study in Japanese

Takuro Fujii, Koki Shibata, Atsuki Yamaguchi, Terufumi Morishita, Yasuhiro Sogawa


Abstract
This paper investigates the effect of tokenizers on the downstream performance of pretrained language models (PLMs) in scriptio continua languages where no explicit spaces exist between words, using Japanese as a case study. The tokenizer for such languages often consists of a morphological analyzer and a subword tokenizer, requiring us to conduct a comprehensive study of all possible pairs. However, previous studies lack this comprehensiveness. We therefore train extensive sets of tokenizers, build a PLM using each, and measure the downstream performance on a wide range of tasks. Our results demonstrate that each downstream task has a different optimal morphological analyzer, and that it is better to use Byte-Pair-Encoding or Unigram rather than WordPiece as a subword tokenizer, regardless of the type of task.
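The two-stage tokenization pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact configuration: it assumes MeCab (via the fugashi wrapper) as the morphological analyzer and a separately trained SentencePiece Unigram model as the subword tokenizer, which is just one of the analyzer/subword pairs the paper compares; "subword.model" is a placeholder path.

# Illustrative two-stage Japanese tokenization (assumption: MeCab + SentencePiece,
# not necessarily the paper's exact tools or settings).
from fugashi import Tagger          # MeCab wrapper; needs a dictionary such as unidic-lite
import sentencepiece as spm

tagger = Tagger()
sp = spm.SentencePieceProcessor(model_file="subword.model")  # placeholder: pretrained subword model

def tokenize(text: str) -> list[str]:
    # Stage 1: morphological analysis splits the unsegmented text into word-like units.
    morphemes = [word.surface for word in tagger(text)]
    # Stage 2: the subword tokenizer further segments the space-joined morphemes.
    return sp.encode(" ".join(morphemes), out_type=str)

print(tokenize("自然言語処理を勉強しています"))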
Anthology ID:
2023.acl-srw.5
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Vishakh Padmakumar, Gisela Vallejo, Yao Fu
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
39–49
URL:
https://aclanthology.org/2023.acl-srw.5
DOI:
10.18653/v1/2023.acl-srw.5
Cite (ACL):
Takuro Fujii, Koki Shibata, Atsuki Yamaguchi, Terufumi Morishita, and Yasuhiro Sogawa. 2023. How do different tokenizers perform on downstream tasks in scriptio continua languages?: A case study in Japanese. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop), pages 39–49, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
How do different tokenizers perform on downstream tasks in scriptio continua languages?: A case study in Japanese (Fujii et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-srw.5.pdf
Video:
https://aclanthology.org/2023.acl-srw.5.mp4