Incorporating Context into Subword Vocabularies

Shaked Yehezkel, Yuval Pinter


Abstract
Most popular current subword tokenizers are trained on word-frequency statistics over a corpus, without considering co-occurrence or contextual information. Nevertheless, the resulting vocabularies are used in the highly contextualized settings of language models. We present SaGe, a tokenizer that tailors subwords to their downstream use by baking the contextualized signal into the vocabulary-creation phase. We show that SaGe keeps token contexts more cohesive than current widespread tokenizers do, without incurring a large price in encoding efficiency or domain robustness. SaGe improves performance on English GLUE classification tasks as well as on NER, and on inference and NER in Turkish, demonstrating its robustness to language properties such as morphological exponence and agglutination.
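To make the abstract's idea concrete, here is a minimal sketch of how a contextualized signal could guide vocabulary creation. This is not the paper's algorithm: the window size, the neighbor-distribution log-likelihood used as a proxy for "context cohesion", and the function names (`context_cohesion_scores`, `ablate_vocab`) are all assumptions introduced here for illustration.

```python
from collections import Counter, defaultdict
import math

def context_cohesion_scores(tokenized_corpus, window=2):
    """Score each token by the average log-probability of its neighbors.

    A token whose contexts concentrate on a few neighbor types scores near
    zero; a token that appears in scattered, incoherent contexts scores far
    below. This stands in for the skip-gram-style contextual signal the
    paper describes; it is a simplification, not the paper's objective.
    """
    neighbor_counts = defaultdict(Counter)  # token -> neighbor frequencies
    for sentence in tokenized_corpus:
        for i, tok in enumerate(sentence):
            lo, hi = max(0, i - window), min(len(sentence), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    neighbor_counts[tok][sentence[j]] += 1
    scores = {}
    for tok, neighbors in neighbor_counts.items():
        total = sum(neighbors.values())
        scores[tok] = sum(
            c * math.log(c / total) for c in neighbors.values()
        ) / total  # equals the negative entropy of the neighbor distribution
    return scores

def ablate_vocab(vocab, scores, n_remove):
    """Drop the n_remove tokens whose contexts are least cohesive."""
    ranked = sorted(vocab, key=lambda t: scores.get(t, float("-inf")))
    return set(vocab) - set(ranked[:n_remove])

# Toy usage: two tokenized sentences, ablate one token.
corpus = [["the", "tokeniz", "er", "splits", "words"],
          ["the", "tokeniz", "er", "learns", "subwords"]]
scores = context_cohesion_scores(corpus)
kept = ablate_vocab({"the", "tokeniz", "er", "splits", "learns"}, scores, 1)
```

In a fuller loop one would re-tokenize the corpus with the shrunken vocabulary, re-score, and repeat until reaching the target size; the scoring function above is the only piece that injects context, and it is the natural place to swap in a real skip-gram objective.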
Anthology ID: 2023.eacl-main.45
Volume: Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month: May
Year: 2023
Address: Dubrovnik, Croatia
Editors: Andreas Vlachos, Isabelle Augenstein
Venue: EACL
Publisher: Association for Computational Linguistics
Pages: 623–635
URL: https://aclanthology.org/2023.eacl-main.45
DOI: 10.18653/v1/2023.eacl-main.45
Cite (ACL):
Shaked Yehezkel and Yuval Pinter. 2023. Incorporating Context into Subword Vocabularies. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 623–635, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal): Incorporating Context into Subword Vocabularies (Yehezkel & Pinter, EACL 2023)
PDF: https://aclanthology.org/2023.eacl-main.45.pdf
Video: https://aclanthology.org/2023.eacl-main.45.mp4