A Transformational Biencoder with In-Domain Negative Sampling for Zero-Shot Entity Linking

Kai Sun, Richong Zhang, Samuel Mensah, Yongyi Mao, Xudong Liu


Abstract
Recent interest in entity linking has focused on the zero-shot scenario, where the entity mention to be labelled at test time is never seen during training or may belong to a domain different from the source domain. Current work leverages pre-trained BERT with the implicit assumption that it bridges the gap between the source and target domain distributions. However, fine-tuned BERT underperforms considerably in the zero-shot setting when applied to a different domain. We address this problem by proposing a Transformational Biencoder that incorporates a transformation into BERT to perform zero-shot transfer from the source domain during training. As in previous work, we rely on negative entities to encourage our model to discriminate the gold entities during training. To generate these negative entities, we propose a simple but effective strategy that takes the domain of the gold entity into account. Our experimental results on the benchmark dataset Zeshel show the effectiveness of our approach and achieve a new state-of-the-art.
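
As a rough illustration of the two ideas described in the abstract, the following is a minimal, hypothetical sketch (not the authors' released code): a BERT biencoder whose mention embedding passes through an assumed linear transformation before dot-product scoring against candidate entities, together with a helper that samples negative entities from the same domain (world) as the gold entity. All names, the linear form of the transformation, and the toy training step are assumptions made for illustration only.

# Hypothetical sketch, not the authors' released code.
import random
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class TransformationalBiencoder(nn.Module):
    def __init__(self, model_name="bert-base-uncased", hidden_size=768):
        super().__init__()
        self.mention_encoder = AutoModel.from_pretrained(model_name)
        self.entity_encoder = AutoModel.from_pretrained(model_name)
        # Assumed form of the "transformation": a linear map on the mention side.
        self.transform = nn.Linear(hidden_size, hidden_size)

    @staticmethod
    def encode(encoder, texts, tokenizer, device):
        batch = tokenizer(texts, padding=True, truncation=True,
                          max_length=128, return_tensors="pt").to(device)
        return encoder(**batch).last_hidden_state[:, 0]  # [CLS] vectors

    def forward(self, mentions, candidates, tokenizer, device="cpu"):
        m = self.transform(self.encode(self.mention_encoder, mentions, tokenizer, device))
        e = self.encode(self.entity_encoder, candidates, tokenizer, device)
        return m @ e.T  # (num_mentions, num_candidates) similarity scores

def sample_in_domain_negatives(gold, entity_pool, k=3):
    """Draw up to k negative entities from the same domain as the gold entity."""
    same_domain = [e for e in entity_pool
                   if e["domain"] == gold["domain"] and e["id"] != gold["id"]]
    return random.sample(same_domain, min(k, len(same_domain)))

# Toy training step: gold entity at index 0, in-domain negatives after it.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = TransformationalBiencoder()
mention = ["He lifted the trophy after the final whistle."]
gold = {"id": "e1", "domain": "football", "text": "Cup final. The decisive match of the tournament."}
pool = [gold,
        {"id": "e2", "domain": "football", "text": "League table. Ranking of the teams."},
        {"id": "e3", "domain": "football", "text": "Penalty kick. A direct shot at goal."},
        {"id": "e4", "domain": "music", "text": "Encore. An extra performance after the set."}]
negatives = sample_in_domain_negatives(gold, pool, k=2)
candidates = [gold["text"]] + [n["text"] for n in negatives]
scores = model(mention, candidates, tokenizer)
loss = nn.functional.cross_entropy(scores, torch.zeros(1, dtype=torch.long))
loss.backward()
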
Anthology ID:
2022.findings-acl.114
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1449–1458
URL:
https://aclanthology.org/2022.findings-acl.114
DOI:
10.18653/v1/2022.findings-acl.114
Cite (ACL):
Kai Sun, Richong Zhang, Samuel Mensah, Yongyi Mao, and Xudong Liu. 2022. A Transformational Biencoder with In-Domain Negative Sampling for Zero-Shot Entity Linking. In Findings of the Association for Computational Linguistics: ACL 2022, pages 1449–1458, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
A Transformational Biencoder with In-Domain Negative Sampling for Zero-Shot Entity Linking (Sun et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.114.pdf
Software:
2022.findings-acl.114.software.zip
Data:
ZESHEL