KIMERA: Injecting Domain Knowledge into Vacant Transformer Heads

Benjamin Winter, Alexei Figueroa Rosero, Alexander Löser, Felix Alexander Gers, Amy Siu


Abstract
Training transformer language models requires vast amounts of text and computational resources. This drastically limits their usage in niche domains for which they are not optimized, or where domain-specific training data is scarce. We focus on the clinical domain, where training data for common tasks is limited while structured ontological data is often readily available. Recent observations in the compression of transformer models show potential for improving the representation capacity of attention heads. We propose KIMERA (Knowledge Injection via Mask Enforced Retraining of Attention) for detecting, retraining, and instilling attention heads with complementary structured domain knowledge. Our novel multi-task training scheme effectively identifies and targets the individual attention heads that are least useful for a given downstream task and optimizes their representation with information from structured data. KIMERA generalizes well, thereby building the basis for efficient fine-tuning. KIMERA achieves significant performance boosts on seven datasets in the medical domain in Information Retrieval and Clinical Outcome Prediction settings. We apply KIMERA to BERT-base to evaluate the extent of the domain transfer, and also improve on the already strong results of BioBERT in the clinical domain.
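The abstract only sketches the mechanism at a high level. Below is a minimal, hedged illustration (PyTorch with the Hugging Face transformers head_mask argument) of how one might score attention heads on a downstream task and single out the least useful ones as targets for knowledge-injection retraining. The gradient-based scoring proxy (cf. Michel et al., 2019), the function names, and the choice of k are assumptions for illustration, not the authors' released implementation.

```python
# Illustrative sketch, not the KIMERA codebase: score attention heads by a
# gradient-based importance proxy, then mark the least useful ("vacant") heads.
import torch
from transformers import BertForSequenceClassification, BertTokenizerFast

model = BertForSequenceClassification.from_pretrained("bert-base-uncased")
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")

num_layers = model.config.num_hidden_layers
num_heads = model.config.num_attention_heads

def head_importance(texts, labels):
    """Proxy importance per head: |d loss / d head_mask| on a downstream batch."""
    head_mask = torch.ones(num_layers, num_heads, requires_grad=True)
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    out = model(**enc, labels=torch.tensor(labels), head_mask=head_mask)
    out.loss.backward()
    return head_mask.grad.abs()  # low magnitude ~ head contributes little

def vacancy_mask(importance, k):
    """1 for the k least important heads (candidates for knowledge injection), else 0."""
    flat = importance.flatten()
    idx = torch.topk(-flat, k).indices  # indices of the k smallest scores
    mask = torch.zeros_like(flat)
    mask[idx] = 1.0
    return mask.view(num_layers, num_heads)

# Hypothetical usage: mark, say, 12 heads for retraining on structured knowledge.
imp = head_importance(["patient discharged in stable condition"], [1])
targets = vacancy_mask(imp, k=12)
```

In a setup like this, the resulting target mask would steer a subsequent multi-task training phase so that gradients from the structured-knowledge objective are directed at the marked heads while the remaining heads continue serving the downstream task.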
Anthology ID:
2022.lrec-1.38
Volume:
Proceedings of the Thirteenth Language Resources and Evaluation Conference
Month:
June
Year:
2022
Address:
Marseille, France
Editors:
Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Jan Odijk, Stelios Piperidis
Venue:
LREC
Publisher:
European Language Resources Association
Pages:
363–373
URL:
https://aclanthology.org/2022.lrec-1.38
Cite (ACL):
Benjamin Winter, Alexei Figueroa Rosero, Alexander Löser, Felix Alexander Gers, and Amy Siu. 2022. KIMERA: Injecting Domain Knowledge into Vacant Transformer Heads. In Proceedings of the Thirteenth Language Resources and Evaluation Conference, pages 363–373, Marseille, France. European Language Resources Association.
Cite (Informal):
KIMERA: Injecting Domain Knowledge into Vacant Transformer Heads (Winter et al., LREC 2022)
PDF:
https://aclanthology.org/2022.lrec-1.38.pdf
Data
MIMIC-III, MedQuAD