DACT-BERT: Differentiable Adaptive Computation Time for an Efficient BERT Inference

Cristobal Eyzaguirre, Felipe del Rio, Vladimir Araujo, Alvaro Soto


Abstract
Large-scale pre-trained language models have shown remarkable results in diverse NLP applications. However, these performance gains have been accompanied by a significant increase in computation time and model size, stressing the need to develop new or complementary strategies to increase the efficiency of these models. This paper proposes DACT-BERT, a differentiable adaptive computation time strategy for BERT-like models. DACT-BERT adds an adaptive computational mechanism to BERT’s regular processing pipeline, which controls the number of Transformer blocks that need to be executed at inference time. By doing this, the model learns to combine the most appropriate intermediate representations for the task at hand. Our experiments demonstrate that, compared to the baselines, our approach excels in a reduced computational regime and is competitive in less restrictive ones. Code is available at https://github.com/ceyzaguirre4/dact_bert.
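The abstract describes the mechanism only at a high level, so the following is a minimal PyTorch-style sketch of a DACT-style adaptive-depth classifier: after every Transformer block an auxiliary head emits an intermediate prediction and a halting value, the predictions are accumulated with a differentiable rule in the spirit of the earlier DACT work, and at inference the loop can exit early once the remaining halting mass can no longer change the top prediction. The module names (`DACTHead`, `DACTTransformerClassifier`), the use of generic `nn.TransformerEncoderLayer` blocks instead of pretrained BERT layers, and the simple margin-based stopping test are illustrative assumptions, not the paper's released implementation.

```python
import torch
import torch.nn as nn


class DACTHead(nn.Module):
    """Auxiliary head for one block: intermediate prediction y_n and halting value h_n."""

    def __init__(self, hidden_size: int, num_labels: int):
        super().__init__()
        self.classifier = nn.Linear(hidden_size, num_labels)
        self.halting = nn.Linear(hidden_size, 1)

    def forward(self, cls_state: torch.Tensor):
        y = torch.softmax(self.classifier(cls_state), dim=-1)   # intermediate prediction
        h = torch.sigmoid(self.halting(cls_state)).squeeze(-1)  # halting confidence in (0, 1)
        return y, h


class DACTTransformerClassifier(nn.Module):
    """Stack of encoder blocks whose effective depth is chosen per input at inference."""

    def __init__(self, hidden_size: int = 256, num_heads: int = 4,
                 num_layers: int = 6, num_labels: int = 2):
        super().__init__()
        self.blocks = nn.ModuleList(
            nn.TransformerEncoderLayer(hidden_size, num_heads, batch_first=True)
            for _ in range(num_layers)
        )
        self.heads = nn.ModuleList(
            DACTHead(hidden_size, num_labels) for _ in range(num_layers)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden_size); token 0 plays the role of [CLS].
        acc = x.new_zeros(x.size(0), self.heads[0].classifier.out_features)  # accumulated output
        p = x.new_ones(x.size(0))                                            # remaining mass, prod_i h_i

        for block, head in zip(self.blocks, self.heads):
            x = block(x)
            y, h = head(x[:, 0])
            # DACT-style accumulation: later blocks only matter while p stays large.
            acc = p.unsqueeze(-1) * y + (1.0 - p).unsqueeze(-1) * acc
            p = p * h

            if not self.training:
                # Illustrative early exit: remaining mass p is too small to overturn the
                # current top-1 class (the paper derives its own stopping rule).
                top2 = acc.topk(2, dim=-1).values
                if bool(((top2[:, 0] - top2[:, 1]) > p).all()):
                    break
        return acc  # convex combination of the intermediate predictions


# Usage sketch: a batch of 8 already-embedded sequences of length 32.
model = DACTTransformerClassifier().eval()
with torch.no_grad():
    probs = model(torch.randn(8, 32, 256))
print(probs.shape)  # torch.Size([8, 2])
```

In DACT-BERT proper the auxiliary heads would sit on top of BERT's own Transformer blocks, and training presumably adds a regularization term that rewards early halting; both details are assumptions here rather than a transcription of the code linked in the abstract.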
Anthology ID:
2022.nlppower-1.10
Volume:
Proceedings of NLP Power! The First Workshop on Efficient Benchmarking in NLP
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Tatiana Shavrina, Vladislav Mikhailov, Valentin Malykh, Ekaterina Artemova, Oleg Serikov, Vitaly Protasov
Venue:
nlppower
Publisher:
Association for Computational Linguistics
Pages:
93–99
URL:
https://aclanthology.org/2022.nlppower-1.10
DOI:
10.18653/v1/2022.nlppower-1.10
Bibkey:
Cite (ACL):
Cristobal Eyzaguirre, Felipe del Rio, Vladimir Araujo, and Alvaro Soto. 2022. DACT-BERT: Differentiable Adaptive Computation Time for an Efficient BERT Inference. In Proceedings of NLP Power! The First Workshop on Efficient Benchmarking in NLP, pages 93–99, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
DACT-BERT: Differentiable Adaptive Computation Time for an Efficient BERT Inference (Eyzaguirre et al., nlppower 2022)
PDF:
https://aclanthology.org/2022.nlppower-1.10.pdf
Video:
https://aclanthology.org/2022.nlppower-1.10.mp4
Data
GLUE