Pavel Chizhov


2024

BPE Gets Picky: Efficient Vocabulary Refinement During Tokenizer Training
Pavel Chizhov | Catherine Arnett | Elizaveta Korotkova | Ivan Yamshchikov
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing

Language models can greatly benefit from efficient tokenization. However, most still rely on the classical Byte-Pair Encoding (BPE) algorithm, a simple and reliable method. BPE has been shown to cause issues such as under-trained tokens and suboptimal compression that may affect downstream performance. We introduce PickyBPE, a modified BPE algorithm that carries out vocabulary refinement during tokenizer training by removing merges that leave intermediate “junk” tokens. Our method improves vocabulary efficiency, eliminates under-trained tokens, and does not compromise text compression. Our experiments show that this method either improves downstream performance or does not harm it.
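To make the refinement idea concrete, here is a minimal Python sketch of a BPE training loop with a "picky" removal step. Everything here is an illustrative assumption rather than the paper's implementation: the function name train_picky_bpe, the junk_threshold parameter, and the simple frequency-ratio criterion (how often a child token occurred inside the just-merged pair relative to its total occurrences) stand in for whatever criterion the paper actually defines.

```python
from collections import Counter

def train_picky_bpe(corpus, num_merges, junk_threshold=0.9):
    """Toy BPE trainer with a picky vocabulary-refinement step.

    After each merge (a, b) -> ab, a child token whose occurrences
    were (almost) entirely inside the merged pair is treated as an
    intermediate "junk" token and dropped from the vocabulary.
    The ratio test below is a hypothetical stand-in for the paper's
    actual criterion.
    """
    # Represent each word as a tuple of symbols, with word frequencies.
    words = Counter(tuple(w) for w in corpus.split())
    vocab = set(ch for word in words for ch in word)
    merges = []

    for _ in range(num_merges):
        # Count adjacent symbol pairs, weighted by word frequency.
        pairs = Counter()
        for word, freq in words.items():
            for a, b in zip(word, word[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        (a, b), pair_freq = pairs.most_common(1)[0]

        # Standalone frequencies of all symbols before the merge.
        child_freq = Counter()
        for word, freq in words.items():
            for sym in word:
                child_freq[sym] += freq

        # Apply the merge greedily, left to right, in every word.
        merged = a + b
        new_words = Counter()
        for word, freq in words.items():
            out, i = [], 0
            while i < len(word):
                if i + 1 < len(word) and word[i] == a and word[i + 1] == b:
                    out.append(merged)
                    i += 2
                else:
                    out.append(word[i])
                    i += 1
            new_words[tuple(out)] += freq
        words = new_words
        vocab.add(merged)
        merges.append((a, b))

        # Picky step: drop a child token if it occurred (almost) only
        # inside the pair just merged, freeing a vocabulary slot.
        # A real tokenizer would keep a byte-level fallback so removed
        # tokens remain encodable.
        for child in (a, b):
            if child in vocab and child_freq[child] > 0:
                if pair_freq / child_freq[child] >= junk_threshold:
                    vocab.discard(child)

    return vocab, merges

vocab, merges = train_picky_bpe("low lower lowest new newer", num_merges=10)
```

On this toy corpus, characters such as "l" and "o" are dropped once they only ever appear inside the merged token "lo"; these are the kind of intermediate tokens the abstract calls "junk", which in vanilla BPE would linger in the vocabulary as under-trained entries.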