Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing Volume 2 - EMNLP '09 2009
DOI: 10.3115/1699571.1699596

Improving verb clustering with automatically acquired selectional preferences

Abstract: In previous research in automatic verb classification, syntactic features have proved the most useful features, although manual classifications rely heavily on semantic features. We show, in contrast with previous work, that considerable additional improvement can be obtained by using semantic features in automatic classification: verb selectional preferences acquired from corpus data using a fully unsupervised method. We report these promising results using a new framework for verb clustering which incorporat…

Cited by 52 publications (76 citation statements)
References 19 publications
“…For example, if cover occurs frequently in the LT structure (e.g., I covered the table with paper) and never in the TL structure (e.g., *I covered paper onto the table), then cover is assigned to a verb class that only allows the LT structure. Another approach to distributional learning is to assign syntax-relevant semantics using the words that co-occur with the verb (Dorr & Jones, 1996; Dumais & Landauer, 1997; Joanis, Stevenson & James, 2008; Resnik, 1996; Rohde, Gonnerman & Plaut, 2006; Riordan & Jones, 2011; Sun & Korhonen, 2009; Redington et al., 1998). For example, if a child hears the utterance He is sloshing paint around, the child classifies slosh with other verbs that take paint as an object (e.g., the man poured paint into the bucket; the girl spilled paint on the table), creating a verb class based on word distributional similarities.…”
Section: )
confidence: 99%
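The second approach quoted above — grouping verbs by the words they co-occur with — can be sketched with a toy example. The corpus pairs, the cosine threshold, and the greedy single-link grouping below are illustrative assumptions for this sketch only; they are not the features or clustering method used in the paper itself, which relies on subcategorization frames and automatically acquired selectional preferences.

```python
from collections import Counter
from math import sqrt

# Hypothetical toy corpus of (verb, direct-object) pairs -- invented for
# illustration, not data from the paper.
pairs = [
    ("pour", "paint"), ("pour", "water"), ("pour", "paint"),
    ("spill", "paint"), ("spill", "water"), ("spill", "milk"),
    ("slosh", "paint"), ("slosh", "water"),
    ("cover", "table"), ("cover", "wall"), ("cover", "floor"),
    ("spread", "table"), ("spread", "wall"),
]

def object_profile(verb):
    """Count vector over the nouns a verb takes as direct object."""
    return Counter(obj for v, obj in pairs if v == verb)

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def cluster(verbs, threshold=0.5):
    """Greedy grouping: a verb joins the first cluster in which every
    member's object distribution is similar above the threshold."""
    clusters = []
    for v in verbs:
        prof = object_profile(v)
        for c in clusters:
            if all(cosine(prof, object_profile(m)) >= threshold for m in c):
                c.append(v)
                break
        else:
            clusters.append([v])
    return clusters

verbs = sorted({v for v, _ in pairs})
print(cluster(verbs))
```

With this toy data, verbs sharing objects like paint and water group together, separately from verbs taking surface objects like table and wall — the distributional intuition the excerpt describes.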
“…First, to ensure we were classifying locative verbs, we selected target verbs from the 140 verbs which had an LT rating (see Section 2.1) and which occurred in our hand-coded data at least 25 times in a locative structure (Sun & Korhonen, 2009). This is important because many locative verbs also occur in non-locative forms.…”
Section: A Corpus-based Test of the Distributional Learning Hypothesis
confidence: 99%
“…[24][25][26]. Interestingly, a recent experiment by Sun et al. demonstrates that it is possible to take an unsupervised clustering method developed for English [27] and apply it successfully to French [25], using French NLP tools for feature extraction, but without language-specific feature engineering. If this approach were applicable to a wider range of languages, it could greatly support the development of VerbNets across languages and language domains.…”
Section: Introduction
confidence: 99%
“…Portuguese and apply the state-of-the-art verb clustering approach developed for English [27] to this language. Using the NLP tools developed for Br.…”
Section: Introduction
confidence: 99%
“…In any case, their role is small, given the small ratio of subcategorization frames shared in general. This diverges from most of the work done in automatic verb classification, which relies on subcategorization frames to create verb classifications that contain semantically coherent classes, following Levin's insight (Schulte im Walde 2006; Sun & Korhonen 2009). In contrast to this, we see that the pairs of senses that are semantically similar according to the Adesse and WordNet semantic fields share at most 36% of the subcategorization frames.…”
Section: Comparison of Both Approaches
confidence: 53%