GiBERT: Enhancing BERT with Linguistic Information using a Lightweight Gated Injection Method

Nicole Peinelt, Marek Rei, Maria Liakata


Abstract
Large pre-trained language models such as BERT have been the driving force behind recent improvements across many NLP tasks. However, BERT is only trained to predict missing words – either through masking or next sentence prediction – and has no knowledge of lexical, syntactic or semantic information beyond what it picks up through unsupervised pre-training. We propose a novel method to explicitly inject linguistic information in the form of word embeddings into any layer of a pre-trained BERT. When injecting counter-fitted and dependency-based embeddings, the performance improvements on multiple semantic similarity datasets indicate that such information is beneficial and currently missing from the original model. Our qualitative analysis shows that counter-fitted embedding injection is particularly beneficial, with notable improvements on examples that require synonym resolution.
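For readers wanting a concrete picture of the gated injection idea sketched in the abstract, the snippet below is a rough PyTorch illustration: external word embeddings (e.g. counter-fitted or dependency-based vectors) are projected to BERT's hidden size and added to a chosen layer's hidden states through a learned gate. The class name, dimensions, and exact gating formula are assumptions for illustration only; the authors' actual implementation is in the released code (wuningxi/gibert).

```python
import torch
import torch.nn as nn

class GatedInjection(nn.Module):
    """Illustrative gated injection of external word embeddings into the
    hidden states of a pre-trained BERT layer (a sketch of the idea in the
    abstract, not the authors' released implementation)."""

    def __init__(self, ext_dim: int, hidden_dim: int):
        super().__init__()
        # Project external embeddings (e.g. 300-d counter-fitted vectors)
        # to BERT's hidden size.
        self.project = nn.Linear(ext_dim, hidden_dim)
        # A learned gate decides, per position and dimension, how much of
        # the injected information is added to the BERT representation.
        self.gate = nn.Linear(hidden_dim * 2, hidden_dim)
        self.layer_norm = nn.LayerNorm(hidden_dim)

    def forward(self, hidden_states: torch.Tensor,
                ext_embeddings: torch.Tensor) -> torch.Tensor:
        # hidden_states:  (batch, seq_len, hidden_dim) from BERT layer k
        # ext_embeddings: (batch, seq_len, ext_dim), aligned to wordpieces
        injected = self.project(ext_embeddings)
        gate = torch.sigmoid(
            self.gate(torch.cat([hidden_states, injected], dim=-1)))
        # Gated residual update, fed into BERT layer k+1.
        return self.layer_norm(hidden_states + gate * injected)


# Usage sketch: inject 300-d external embeddings between two 768-d BERT layers.
injection = GatedInjection(ext_dim=300, hidden_dim=768)
hidden = torch.randn(2, 16, 768)     # output of some BERT layer
external = torch.randn(2, 16, 300)   # aligned external word embeddings
augmented = injection(hidden, external)
print(augmented.shape)               # torch.Size([2, 16, 768])
```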
Anthology ID:
2021.findings-emnlp.200
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2322–2336
URL:
https://aclanthology.org/2021.findings-emnlp.200
DOI:
10.18653/v1/2021.findings-emnlp.200
Cite (ACL):
Nicole Peinelt, Marek Rei, and Maria Liakata. 2021. GiBERT: Enhancing BERT with Linguistic Information using a Lightweight Gated Injection Method. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 2322–2336, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
GiBERT: Enhancing BERT with Linguistic Information using a Lightweight Gated Injection Method (Peinelt et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.200.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.200.mp4
Code:
wuningxi/gibert