Simultaneously Self-Attending to Text and Entities for Knowledge-Informed Text Representations

Dung Thai, Raghuveer Thirukovalluru, Trapit Bansal, Andrew McCallum
Abstract
Pre-trained language models have emerged as highly successful methods for learning good text representations. However, the amount of structured knowledge retained in such models, and how (if at all) it can be extracted, remains an open question. In this work, we aim to directly learn text representations which leverage structured knowledge about entities mentioned in the text. This can be particularly beneficial for knowledge-intensive downstream tasks. Our approach uses self-attention between the words in a text and the knowledge graph (KG) entities mentioned in it. While existing methods require entity-linked data for pre-training, we train with a mention-span masking objective and a candidate ranking objective; these require no entity links and assume only access to an alias table for retrieving candidates, enabling large-scale pre-training. We show that the proposed model learns knowledge-informed text representations that yield improvements over existing methods on downstream tasks.
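
The core idea can be pictured as a single Transformer encoder whose input sequence contains both the word embeddings and the embeddings of candidate entities retrieved from an alias table, so that words and entities attend to each other in every layer. The PyTorch sketch below is only illustrative, not the authors' released code: names such as JointEncoder and rank_candidates are hypothetical, and the paper's mention-span masking and link-free candidate ranking objectives are not reproduced; the sketch shows simultaneous self-attention over text and candidate entities plus a simple span-vs-candidate scoring step.

```python
import torch
import torch.nn as nn

class JointEncoder(nn.Module):
    """Runs self-attention over text tokens and KG entity candidates together."""
    def __init__(self, vocab_size, num_entities, dim=256, heads=4, layers=2):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, dim)
        self.ent_emb = nn.Embedding(num_entities, dim)
        block = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(block, num_layers=layers)

    def forward(self, token_ids, cand_entity_ids):
        # token_ids: (batch, seq_len); cand_entity_ids: (batch, num_candidates),
        # e.g. candidates retrieved from an alias table for mentions in the text.
        joint = torch.cat([self.tok_emb(token_ids), self.ent_emb(cand_entity_ids)], dim=1)
        hidden = self.encoder(joint)  # words and entities attend to each other
        n = token_ids.size(1)
        return hidden[:, :n], hidden[:, n:]  # text states, entity states

def rank_candidates(text_states, ent_states, span):
    # Score each candidate entity against the mean-pooled mention-span
    # representation (a stand-in for the masked span used during pre-training).
    start, end = span
    span_repr = text_states[:, start:end].mean(dim=1)         # (batch, dim)
    return torch.einsum("bd,bcd->bc", span_repr, ent_states)  # (batch, num_candidates)

# Toy usage with random ids; shapes only, no trained weights.
enc = JointEncoder(vocab_size=30522, num_entities=1000)
tokens = torch.randint(0, 30522, (2, 16))
cands = torch.randint(0, 1000, (2, 5))
text_h, ent_h = enc(tokens, cands)
scores = rank_candidates(text_h, ent_h, span=(3, 6))
print(scores.shape)  # torch.Size([2, 5])
```

In the actual model, the span representation would come from a masked mention span and the ranking objective would be applied over the alias-table candidates, which is what lets training proceed without gold entity links.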
Anthology ID: 2021.repl4nlp-1.25
Volume: Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021)
Month: August
Year: 2021
Address: Online
Editors: Anna Rogers, Iacer Calixto, Ivan Vulić, Naomi Saphra, Nora Kassner, Oana-Maria Camburu, Trapit Bansal, Vered Shwartz
Venue: RepL4NLP
Publisher: Association for Computational Linguistics
Pages: 241–247
URL: https://aclanthology.org/2021.repl4nlp-1.25
DOI: 10.18653/v1/2021.repl4nlp-1.25
Cite (ACL):
Dung Thai, Raghuveer Thirukovalluru, Trapit Bansal, and Andrew McCallum. 2021. Simultaneously Self-Attending to Text and Entities for Knowledge-Informed Text Representations. In Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021), pages 241–247, Online. Association for Computational Linguistics.
Cite (Informal):
Simultaneously Self-Attending to Text and Entities for Knowledge-Informed Text Representations (Thai et al., RepL4NLP 2021)
PDF: https://aclanthology.org/2021.repl4nlp-1.25.pdf
Data: Open Entity