Revisiting Token Dropping Strategy in Efficient BERT Pretraining

Qihuang Zhong, Liang Ding, Juhua Liu, Xuebo Liu, Min Zhang, Bo Du, Dacheng Tao


Abstract
Token dropping is a recently proposed strategy to speed up the pretraining of masked language models, such as BERT, by skipping the computation of a subset of the input tokens at several middle layers. It can effectively reduce training time without much performance degradation on downstream tasks. However, we empirically find that token dropping is prone to a semantic-loss problem and falls short in handling semantic-intense tasks. Motivated by this, we propose a simple yet effective semantic-consistent learning method (ScTD) to improve token dropping. ScTD encourages the model to learn how to preserve semantic information in the representation space. Extensive experiments on 12 tasks show that, with the help of our ScTD, token dropping achieves consistent and significant performance gains across all task types and model sizes. More encouragingly, ScTD saves up to 57% of pretraining time and brings up to a +1.56% average improvement over vanilla token dropping.
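The abstract describes token dropping only at a high level: a subset of tokens is excluded from the computation in the middle encoder layers and merged back near the top. The PyTorch sketch below illustrates that general idea and is not the authors' implementation; the layer split points, the keep ratio, and the random "importance" scores are illustrative assumptions (the paper selects tokens by an importance criterion and adds the proposed semantic-consistent learning on top).

import torch
import torch.nn as nn

class TokenDropEncoder(nn.Module):
    """Minimal sketch: drop a subset of tokens in the middle layers, merge them back later."""

    def __init__(self, d_model=256, nhead=4, n_layers=12,
                 drop_start=2, drop_end=10, keep_ratio=0.5):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
            for _ in range(n_layers)
        )
        self.drop_start = drop_start   # first layer that sees only the kept tokens
        self.drop_end = drop_end       # layer at which dropped tokens are merged back
        self.keep_ratio = keep_ratio   # fraction of tokens processed by the middle layers

    def forward(self, x):              # x: (batch, seq_len, d_model) token embeddings
        B, T, D = x.shape
        n_keep = max(1, int(T * self.keep_ratio))
        # Placeholder importance scores: random here, purely for illustration.
        scores = torch.rand(B, T, device=x.device)
        keep_idx = scores.topk(n_keep, dim=1).indices.sort(dim=1).values
        idx = keep_idx.unsqueeze(-1).expand(-1, -1, D)

        full = x
        for i, layer in enumerate(self.layers):
            if i == self.drop_start:
                # Remember the full sequence, then keep only the selected tokens.
                full = x
                x = x.gather(1, idx)
            if i == self.drop_end:
                # Write the updated kept tokens back into the (frozen) full sequence.
                x = full.scatter(1, idx, x)
            x = layer(x)
        return x

# Illustrative usage: a batch of 2 sequences of length 128.
enc = TokenDropEncoder()
out = enc(torch.randn(2, 128, 256))    # -> shape (2, 128, 256)

Because the middle layers operate on only a fraction of the sequence, their attention and feed-forward cost shrinks accordingly, which is the source of the pretraining speed-up the abstract refers to.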
Anthology ID:
2023.acl-long.579
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
10391–10405
URL:
https://aclanthology.org/2023.acl-long.579
DOI:
10.18653/v1/2023.acl-long.579
Cite (ACL):
Qihuang Zhong, Liang Ding, Juhua Liu, Xuebo Liu, Min Zhang, Bo Du, and Dacheng Tao. 2023. Revisiting Token Dropping Strategy in Efficient BERT Pretraining. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 10391–10405, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Revisiting Token Dropping Strategy in Efficient BERT Pretraining (Zhong et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.579.pdf
Video:
https://aclanthology.org/2023.acl-long.579.mp4