A Multi-dimensional Evaluation of Tokenizer-free Multilingual Pretrained Models

Jimin Sun, Patrick Fernandes, Xinyi Wang, Graham Neubig


Abstract
Recent works on tokenizer-free multilingual pretrained models show promising results in improving cross-lingual transfer and reducing engineering overhead compared to subword-based alternatives. However, previous work mainly focuses on reporting accuracy on a limited set of tasks and data settings, placing less emphasis on other important factors when tuning and deploying the models in practice, such as memory usage, inference speed, and finetuning data efficiency. We attempt to fill this gap by performing a comprehensive empirical comparison of multilingual tokenizer-free and subword-based models considering these various dimensions. Surprisingly, we find that subword-based models might still be the most practical choice in many settings, achieving better performance at lower inference latency and memory usage. Based on these results, we encourage future work in tokenizer-free methods to consider these factors when designing and evaluating new models.
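Two of the dimensions mentioned above, inference latency and memory footprint, can be probed with a small benchmarking script. The following is a minimal sketch, not the paper's actual experimental setup: it assumes the Hugging Face transformers library and uses two publicly available checkpoints chosen for illustration, xlm-roberta-base (subword-based) and google/canine-s (tokenizer-free, character-level).

    import time
    import torch
    from transformers import AutoModel, AutoTokenizer

    # Illustrative checkpoints only; these are assumptions for the sketch,
    # not the exact models or measurement protocol evaluated in the paper.
    CHECKPOINTS = ["xlm-roberta-base", "google/canine-s"]

    SENTENCE = "Tokenizer-free models operate directly on characters or bytes."

    def benchmark(name, n_runs=50):
        tokenizer = AutoTokenizer.from_pretrained(name)
        model = AutoModel.from_pretrained(name).eval()
        inputs = tokenizer(SENTENCE, return_tensors="pt")

        with torch.no_grad():
            model(**inputs)  # warm-up forward pass
            start = time.perf_counter()
            for _ in range(n_runs):
                model(**inputs)
            latency = (time.perf_counter() - start) / n_runs

        n_params = sum(p.numel() for p in model.parameters())
        print(f"{name}: seq_len={inputs['input_ids'].shape[1]}, "
              f"params={n_params / 1e6:.0f}M, latency={latency * 1000:.1f} ms/forward")

    for ckpt in CHECKPOINTS:
        benchmark(ckpt)

Note that the tokenizer-free model produces a much longer input sequence for the same sentence (one position per character rather than per subword), which is one source of the latency and memory differences the paper measures.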
Anthology ID:
2023.findings-eacl.128
Volume:
Findings of the Association for Computational Linguistics: EACL 2023
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1725–1735
URL:
https://aclanthology.org/2023.findings-eacl.128
DOI:
10.18653/v1/2023.findings-eacl.128
Cite (ACL):
Jimin Sun, Patrick Fernandes, Xinyi Wang, and Graham Neubig. 2023. A Multi-dimensional Evaluation of Tokenizer-free Multilingual Pretrained Models. In Findings of the Association for Computational Linguistics: EACL 2023, pages 1725–1735, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
A Multi-dimensional Evaluation of Tokenizer-free Multilingual Pretrained Models (Sun et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-eacl.128.pdf
Video:
https://aclanthology.org/2023.findings-eacl.128.mp4