Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing, 2015
DOI: 10.3115/v1/p15-1019
Generative Event Schema Induction with Entity Disambiguation

Abstract: This paper presents a generative model for event schema induction. Previous methods in the literature use only head words to represent entities. However, elements other than head words contain useful information; for instance, "an armed man" is more discriminative than "man". Our model takes this information into account and represents it precisely using probabilistic topic distributions. We illustrate that such information plays an important role in parameter estimation. Mostly, it makes topic distributions more c…

Cited by 62 publications (70 citation statements)
References 18 publications
“…Such methods for script-learning also include Bayesian approaches (Bejan, 2008; Frermann et al., 2014), sequence alignment algorithms (Regneri et al., 2010) and neural networks (Modi and Titov, 2014; Granroth-Wilding and Clark, 2016; Pichotta and Mooney, 2016). There has also been work on representing events in a structured manner using schemas, which are learned probabilistically (Chambers, 2013; Cheung et al., 2013; Nguyen et al., 2015), using graphs (Balasubramanian et al., 2013) or neural approaches (Titov and Khoddam, 2015). Recently, Ferraro and Van Durme (2016) presented a unified Bayesian model for scripts and frames.…”
Section: Events-centered Learning
Citation type: mentioning (confidence: 99%)
“…Thus what is learned is not evaluated for contingency (Chambers and Jurafsky, 2008; Chambers and Jurafsky, 2009; Manshadi et al., 2008; Nguyen et al., 2015; Balasubramanian et al., 2013; Pichotta and Mooney, 2014). Historically, work on scripts explicitly modeled causality (Lehnert, 1981; Mooney and DeJong, 1985), inter alia.…”
Section: Storm
Citation type: mentioning (confidence: 99%)
“…Chambers and Jurafsky (2008) first proposed an unsupervised approach to learn partially ordered sets of events from raw text. Many extensions have been introduced since, including the unsupervised learning of narrative schemas and scripts (Chambers and Jurafsky, 2009; Regneri et al., 2011), event schemas and frames (Chambers and Jurafsky, 2011; Balasubramanian et al., 2013; Sha et al., 2016; Huang et al., 2016; Mostafazadeh et al., 2016b), and generative models that learn latent structures of event knowledge (Cheung et al., 2013; Chambers, 2013; Bamman et al., 2014; Nguyen et al., 2015). Another direction for learning event-centred knowledge is causality identification (Do et al., 2011; Radinsky et al., 2012; Berant et al., 2014; Hashimoto et al., 2015; Gui et al., 2016), which aims to identify causal relations in text.…”
Section: Related Work
Citation type: mentioning (confidence: 99%)