A Simple but Effective Pluggable Entity Lookup Table for Pre-trained Language Models

Deming Ye, Yankai Lin, Peng Li, Maosong Sun, Zhiyuan Liu


Abstract
Pre-trained language models (PLMs) struggle to recall the rich factual knowledge about entities exhibited in large-scale corpora, especially rare entities. In this paper, we propose to build a simple but effective Pluggable Entity Lookup Table (PELT) on demand by aggregating an entity's output representations over its multiple occurrences in a corpus. PELT can be compatibly plugged into PLMs as input to infuse supplemental entity knowledge. Compared to previous knowledge-enhanced PLMs, PELT requires only 0.2%-5% of their pre-computation, and it can acquire knowledge from out-of-domain corpora for domain-adaptation scenarios. Experiments on knowledge-related tasks demonstrate that our method, PELT, can flexibly and effectively transfer entity knowledge from related corpora into PLMs with different architectures. Our code and models are publicly available at https://github.com/thunlp/PELT.
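To make the aggregation step concrete, below is a minimal, hypothetical Python sketch of the idea described in the abstract, not the authors' implementation (see the repository linked above). It assumes a HuggingFace masked language model: each mention of an entity is replaced by [MASK], the model's final-layer hidden state at the mask position is collected for each occurrence, and the vectors are summed to form the entity's lookup-table entry. The function name entity_embedding and the example sentences are illustrative only.

import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def entity_embedding(masked_sentences):
    # Each sentence has the entity mention replaced by [MASK]; the PLM's
    # final-layer hidden state at that position is one occurrence vector.
    vectors = []
    with torch.no_grad():
        for sent in masked_sentences:
            inputs = tokenizer(sent, return_tensors="pt")
            outputs = model(**inputs, output_hidden_states=True)
            hidden = outputs.hidden_states[-1]  # shape: (1, seq_len, dim)
            mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero()[0, 0]
            vectors.append(hidden[0, mask_pos])
    # Aggregate over occurrences (summed here; the paper also rescales
    # the result before plugging it into the PLM's input sequence).
    return torch.stack(vectors).sum(dim=0)

# Building a tiny lookup table for a single entity:
pelt_table = {
    "Dublin": entity_embedding([
        "[MASK] is the capital of Ireland.",
        "The conference was held in [MASK] in May 2022.",
    ])
}

In the paper, such entity vectors are inserted alongside the entity mention in the PLM's input sequence, so the model consumes the supplemental knowledge without any architectural change.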
Anthology ID:
2022.acl-short.57
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
523–529
URL:
https://aclanthology.org/2022.acl-short.57
DOI:
10.18653/v1/2022.acl-short.57
Cite (ACL):
Deming Ye, Yankai Lin, Peng Li, Maosong Sun, and Zhiyuan Liu. 2022. A Simple but Effective Pluggable Entity Lookup Table for Pre-trained Language Models. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 523–529, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
A Simple but Effective Pluggable Entity Lookup Table for Pre-trained Language Models (Ye et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-short.57.pdf
Software:
2022.acl-short.57.software.zip
Code:
thunlp/pelt
Data:
FewRel, FewRel 2.0, LAMA, S2ORC, T-REx