Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL) 2019
DOI: 10.18653/v1/k19-1051
KnowSemLM: A Knowledge Infused Semantic Language Model

Abstract: Story understanding requires developing expectations of what events come next in text. Prior knowledge, both statistical and declarative, is essential in guiding such expectations. While existing semantic language models (SemLM) capture event co-occurrence information by modeling event sequences as semantic frames, entities, and other semantic units, this paper aims at augmenting them with causal knowledge (i.e., one event is likely to lead to another). Such knowledge is modeled at the frame and entity level, an…
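The abstract describes modeling event sequences so that, given the events seen so far, the model assigns probabilities to what comes next. As a rough illustration only (not the paper's actual architecture, which augments a SemLM with causal knowledge), a minimal bigram model over abstract event labels captures the co-occurrence side of this idea; all event names below are invented toy data:

```python
from collections import Counter, defaultdict

class EventSequenceLM:
    """Toy bigram model over event sequences (abstract frame/verb labels)."""

    def __init__(self):
        self.bigrams = defaultdict(Counter)  # prev event -> next-event counts

    def fit(self, sequences):
        for seq in sequences:
            padded = ["<s>"] + list(seq)  # <s> marks the story start
            for prev, curr in zip(padded, padded[1:]):
                self.bigrams[prev][curr] += 1

    def next_event_probs(self, prev_event):
        counts = self.bigrams[prev_event]
        total = sum(counts.values())
        return {e: c / total for e, c in counts.items()} if total else {}

# Toy corpus: crime-story event chains (hypothetical labels).
stories = [
    ["search", "arrest", "charge", "convict"],
    ["search", "arrest", "charge", "acquit"],
    ["arrest", "release"],
]
lm = EventSequenceLM()
lm.fit(stories)
probs = lm.next_event_probs("arrest")
# "charge" follows "arrest" in 2 of the 3 observed continuations.
```

A real SemLM would operate over disambiguated semantic frames and entity mentions rather than raw strings, and KnowSemLM additionally injects declarative causal knowledge into these expectations.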

Cited by 13 publications (5 citation statements) · References 52 publications
“…4 Related Work 4.1 Schema Induction Chambers and Jurafsky (2008, 2009) automatically learned a schema from newswire text based on coreference and statistical probability models. Peng and Roth (2016); Peng et al. (2019) generated an event schema based on their proposed semantic language model, representing the whole schema as a linear sequence of abstract VerbNet (Schuler, 2005) verb senses. In these works, the schema was created for a single actor (protagonist).…”
Section: Metric (mentioning)
confidence: 99%
“…Knowledge infusion is a common strategy [14,15] to handle low-resource learning tasks with limited supervision. In RE, knowledge infusion has also been found effective.…”
Section: Knowledge Infusion in RE (mentioning)
confidence: 99%
“…Narrative Event Schema Induction. Previous work (Chambers and Jurafsky, 2008; Jans et al., 2012; Balasubramanian et al., 2013; Mooney, 2014, 2016; Rudinger et al., 2015; Granroth-Wilding and Clark, 2016; Modi, 2016; Mostafazadeh et al., 2016a; Peng et al., 2019) focuses on inducing narrative schemas as partially ordered sets of events (represented as verbs) sharing a common argument. The event order is further extended to include causality (Mostafazadeh et al., 2016b; Kalm et al., 2019), and a temporal script graph is proposed where events and arguments are abstracted as event types and participant types (Modi et al., 2017; Wanzare et al., 2017; Zhai et al., 2019).…”
Section: Related Work (mentioning)
confidence: 99%
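The citation statement above describes narrative schemas as ordered sets of verb events sharing a common argument (the protagonist). As a rough data-structure sketch only, assuming nothing about any cited system's internals, one way to represent such a schema is a protagonist slot plus an ordered list of (verb, role) pairs; all names below are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class SchemaEvent:
    verb: str  # abstract verb/frame label
    role: str  # grammatical role the protagonist fills ("subj" or "obj")

@dataclass
class NarrativeSchema:
    protagonist: str                                  # the shared argument
    events: list = field(default_factory=list)        # ordered event chain

    def add(self, verb: str, role: str) -> None:
        self.events.append(SchemaEvent(verb, role))

# Toy "criminal prosecution" schema centered on one protagonist.
schema = NarrativeSchema(protagonist="suspect")
schema.add("arrest", role="obj")    # police arrest the SUSPECT
schema.add("charge", role="obj")    # prosecutors charge the SUSPECT
schema.add("convict", role="obj")   # the court convicts the SUSPECT
```

Extensions mentioned in the statement, such as causal links between events or abstracting arguments into participant types for a temporal script graph, would add edges or type slots on top of this flat chain.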