2005
DOI: 10.1007/s10994-005-0913-1

Corpus-based Learning of Analogies and Semantic Relations

Abstract: We present an algorithm for learning from unlabeled text, based on the Vector Space Model (VSM) of information retrieval, that can solve verbal analogy questions of the kind found in the SAT college entrance exam. A verbal analogy has the form A:B::C:D, meaning "A is to B as C is to D"; for example, mason:stone::carpenter:wood. SAT analogy questions provide a word pair, A:B, and the problem is to select the most analogous word pair, C:D, from a set of five choices. The VSM algorithm correctly answers…
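The abstract leaves the selection step implicit, so here is a minimal sketch of how one SAT-style question could be answered once every word pair has been reduced to a relation vector: pick the choice whose vector has the highest cosine similarity to the stem pair's vector. The vectors and helper names below are invented for illustration only; they are not the paper's data or code.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length frequency vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def answer_analogy(stem_vector, choice_vectors):
    """Return the candidate C:D pair whose relation vector is most similar
    (by cosine) to the relation vector of the stem A:B pair."""
    return max(choice_vectors,
               key=lambda pair: cosine(stem_vector, choice_vectors[pair]))

# Hypothetical hand-made vectors, purely for illustration.
stem = [3, 0, 5, 1]                        # mason:stone
choices = {
    "teacher:chalk":     [1, 2, 0, 4],
    "carpenter:wood":    [3, 0, 4, 1],
    "soldier:gun":       [0, 1, 1, 2],
    "photograph:camera": [2, 3, 0, 0],
    "book:word":         [0, 0, 1, 3],
}
print(answer_analogy(stem, choices))       # -> carpenter:wood
```

In the actual method the vectors are not hand-made; they come from pattern frequencies counted in a large corpus, as the citing papers below describe.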

Cited by 120 publications (110 citation statements)
References 41 publications

Citation statements (ordered by relevance):

“…Hofstadter (2007) claimed, "all meaning comes from analogies." In NLP, analogical algorithms have been applied to machine translation (Lepage and Denoual, 2005), morphology (Lepage, 1998), and semantic relations (Turney and Littman, 2005). Analogy provides a framework that has the potential to unify the field of semantics.…”
Section: Results
Mentioning confidence: 99%
“…In order to evaluate the proposed approach for analogy generation, we follow the method explained by Turney and Littman (Turney and Littman 2005) for evaluating analogies using a large corpus. In their study, Turney and Littman reported that their method can solve about 47% of multiple-choice analogy questions (compared to an average of 57% correct answers solved by high school students).…”
Section: Corpus-based Evaluation
Mentioning confidence: 99%
“…The best performing individual module was based on Vector Space Model (VSM). In the VSM approach to measuring relational similarity [7], first a vector is created for a word-pair (X, Y) by counting the frequencies of various lexical patterns containing X and Y. In their experiments they used 128 manually created patterns such as "X of Y", "Y of X", "X to Y" and "Y to X".…”
Section: Related Work
Mentioning confidence: 99%
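The quoted description lends itself to a small sketch. The snippet below builds a pattern-frequency vector for a word pair by counting, in a toy corpus, how often each joining pattern occurs with the pair substituted in. The pattern list, the toy corpus, and the helper names are assumptions made for illustration; the original work used 128 manually created patterns and frequencies from a very large corpus.

```python
import re

# Illustrative joining patterns in the spirit of "X of Y", "Y of X", "X to Y", ...
PATTERNS = ["{X} of {Y}", "{Y} of {X}", "{X} to {Y}", "{Y} to {X}",
            "{X} for {Y}", "{Y} for {X}", "{X} with {Y}", "{Y} with {X}"]

# Tiny stand-in corpus; the real method counts pattern frequencies in a large text corpus.
CORPUS = ("the block was stone for mason and apprentice alike . "
          "he sold wood for carpenter workshops . "
          "she turned stone to mason tools . "
          "they shipped wood to carpenter guilds .")

def count(phrase):
    """Number of occurrences of `phrase` in the toy corpus."""
    return len(re.findall(re.escape(phrase), CORPUS))

def pair_vector(x, y):
    """One component per lexical pattern: the corpus frequency of that
    pattern instantiated with the word pair (x, y)."""
    return [count(p.format(X=x, Y=y)) for p in PATTERNS]

print(pair_vector("mason", "stone"))       # [0, 0, 0, 1, 0, 1, 0, 0] on this toy corpus
print(pair_vector("carpenter", "wood"))    # [0, 0, 0, 1, 0, 1, 0, 0] -> same relation profile
```

On this contrived corpus, mason:stone and carpenter:wood end up with identical pattern profiles, which is exactly the kind of agreement the cosine comparison in the earlier sketch rewards.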
“…3.1 to extract lexical patterns for 374 SAT multiple-choice analogy questions. This dataset was first proposed by Turney and Littman [7] as a benchmark dataset to evaluate relational similarity measures. Generally, there are six word pairs in each question (i.e.…
Section: Pattern Selection
Mentioning confidence: 99%