“Politeness, you simpleton!” retorted [MASK]: Masked prediction of literary characters

Eric Holgate, Katrin Erk


Abstract
What is the best way to learn embeddings for entities, and what can be learned from them? We consider this question for the case of literary characters. We address the highly challenging task of guessing, from a sentence in the novel, which character is being talked about, and we probe the embeddings to see what information they encode about the characters they represent. We find that when continuously trained, entity embeddings do well at the masked entity prediction task, and that they encode considerable information about the traits and characteristics of the entities.
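The masked-prediction setup described in the abstract can be sketched in a toy form as follows. The character names, vector dimensions, and dot-product scoring rule here are illustrative assumptions for exposition only, not details of the paper's actual model:

```python
# Hypothetical toy entity embeddings (names and values are illustrative,
# not taken from the paper).
entity_embeddings = {
    "Scrooge": [1.0, 0.0, 0.0],
    "Marley": [0.0, 1.0, 0.0],
    "Cratchit": [0.0, 0.0, 1.0],
}

def predict_masked_character(context_vec, embeddings):
    """Score each candidate character by the dot product between the
    encoded [MASK] context and that character's embedding; return the
    highest-scoring name."""
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))
    return max(embeddings, key=lambda name: dot(context_vec, embeddings[name]))

# A stand-in for an encoder's vector at the masked position, pointing
# mostly toward "Scrooge".
print(predict_masked_character([0.9, 0.2, 0.1], entity_embeddings))  # prints Scrooge
```

In practice the context vector would come from a trained encoder's representation of the `[MASK]` position, and the entity embeddings would be learned parameters; only the argmax-over-dot-products selection is shown here.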
Anthology ID: 2021.iwcs-1.19
Volume: Proceedings of the 14th International Conference on Computational Semantics (IWCS)
Month: June
Year: 2021
Address: Groningen, The Netherlands (online)
Venue: IWCS
SIG: SIGSEM
Publisher: Association for Computational Linguistics
Pages: 202–211
URL: https://aclanthology.org/2021.iwcs-1.19
Cite (ACL): Eric Holgate and Katrin Erk. 2021. “Politeness, you simpleton!” retorted [MASK]: Masked prediction of literary characters. In Proceedings of the 14th International Conference on Computational Semantics (IWCS), pages 202–211, Groningen, The Netherlands (online). Association for Computational Linguistics.
Cite (Informal): “Politeness, you simpleton!” retorted [MASK]: Masked prediction of literary characters (Holgate & Erk, IWCS 2021)
PDF: https://aclanthology.org/2021.iwcs-1.19.pdf
Attachment: 2021.iwcs-1.19.Attachment.zip