KnowSemLM: A Knowledge Infused Semantic Language Model

Haoruo Peng, Qiang Ning, Dan Roth


Abstract
Story understanding requires developing expectations of what events come next in text. Prior knowledge – both statistical and declarative – is essential in guiding such expectations. While existing semantic language models (SemLM) capture event co-occurrence information by modeling event sequences as semantic frames, entities, and other semantic units, this paper aims at augmenting them with causal knowledge (i.e., one event is likely to lead to another). Such knowledge is modeled at the frame and entity level, and can be obtained either statistically from text or stated declaratively. The proposed method, KnowSemLM, infuses this knowledge into a semantic LM by joint training and inference, and is shown to be effective on both the event cloze test and story/referent prediction tasks.
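To make the abstract's idea concrete, below is a minimal, hypothetical sketch of how a semantic LM over event frames could be combined with causal knowledge. It is not the paper's actual KnowSemLM formulation (which uses joint training and inference); the bigram estimate, the interpolation weight `lam`, and the `causal_scores` table are illustrative assumptions only.

```python
from collections import defaultdict

class EventSequenceLM:
    """Hypothetical sketch: a bigram LM over event frames (e.g., "order(food)")
    whose next-event score interpolates corpus co-occurrence statistics with an
    external causal-knowledge score. Illustrative only, not the paper's model."""

    def __init__(self, causal_scores, lam=0.7):
        # causal_scores: (frame_a, frame_b) -> plausibility that frame_a
        # causally leads to frame_b (learned statistically or stated declaratively).
        self.causal_scores = causal_scores
        self.lam = lam  # interpolation weight (assumed, not from the paper)
        self.bigram = defaultdict(lambda: defaultdict(int))
        self.unigram = defaultdict(int)

    def train(self, event_sequences):
        # Count adjacent event-frame pairs to model event co-occurrence.
        for seq in event_sequences:
            for prev, cur in zip(seq, seq[1:]):
                self.bigram[prev][cur] += 1
                self.unigram[prev] += 1

    def next_event_prob(self, prev, cand):
        # Co-occurrence component: maximum-likelihood bigram estimate.
        total = self.unigram[prev]
        p_lm = self.bigram[prev][cand] / total if total else 0.0
        # Causal-knowledge component: score from the external knowledge table.
        p_know = self.causal_scores.get((prev, cand), 0.0)
        return self.lam * p_lm + (1 - self.lam) * p_know

    def cloze(self, prev, candidates):
        # Event cloze test: pick the most plausible next event among candidates.
        return max(candidates, key=lambda c: self.next_event_prob(prev, c))

lm = EventSequenceLM({("order(food)", "eat(food)"): 0.9})
lm.train([["enter(restaurant)", "order(food)", "eat(food)", "pay(bill)"]])
lm.cloze("order(food)", ["eat(food)", "pay(bill)"])  # -> "eat(food)"
```

In this toy run, "eat(food)" wins both because it follows "order(food)" in the training sequence and because the causal table says ordering food tends to lead to eating it; the interpolation shows, in miniature, how the two knowledge sources reinforce each other.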
Anthology ID: K19-1051
Volume: Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL)
Month: November
Year: 2019
Address: Hong Kong, China
Editors: Mohit Bansal, Aline Villavicencio
Venue: CoNLL
SIG: SIGNLL
Publisher: Association for Computational Linguistics
Pages: 550–562
URL: https://aclanthology.org/K19-1051
DOI: 10.18653/v1/K19-1051
Cite (ACL): Haoruo Peng, Qiang Ning, and Dan Roth. 2019. KnowSemLM: A Knowledge Infused Semantic Language Model. In Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), pages 550–562, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal): KnowSemLM: A Knowledge Infused Semantic Language Model (Peng et al., CoNLL 2019)
PDF: https://aclanthology.org/K19-1051.pdf