1979
DOI: 10.1207/s15516709cog0302_3
Inferring Conceptual Graphs

Abstract: This paper investigates the mechanisms a program may use to learn conceptual structures that represent natural language meaning. A computer program named Moran is described that infers conceptual structures from pictorial input data. Moran is presented with “snapshots” of an environment and an English sentence describing the action that takes place between the snapshots. The learning task is to associate each root verb with a conceptual structure that represents the types of objects that participate in the act…
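To make the learning task concrete, the sketch below shows one plausible way to encode a Moran-style training example: two snapshots (the world before and after an action) plus the English sentence describing it, with the learning target being an association from the root verb to a conceptual structure summarising the change. All class names, predicates, and the crude verb extractor are hypothetical illustrations, not the paper's actual representation.

```python
from dataclasses import dataclass

# A snapshot of the environment: a set of atomic assertions about objects,
# e.g. ("at", "block1", "table") or ("holding", "John", "block1").
Snapshot = frozenset

@dataclass
class Scenario:
    """One training example: the world before and after an action,
    plus the English sentence describing that action."""
    before: Snapshot
    after: Snapshot
    sentence: str

# Hypothetical example input; predicates and object names are illustrative only.
example = Scenario(
    before=frozenset({("at", "block1", "table"), ("emptyhanded", "John")}),
    after=frozenset({("holding", "John", "block1")}),
    sentence="John picks up the block.",
)

def root_verb(sentence: str) -> str:
    """Crude stand-in for a parser: take the second token as the verb.
    A real system would derive the root verb from a syntactic analysis."""
    return sentence.split()[1].rstrip(".")

def state_change(s: Scenario) -> dict:
    """Summarise what changed between the two snapshots."""
    return {"added": s.after - s.before, "deleted": s.before - s.after}

# The learning target: an association from each root verb to a conceptual
# structure describing the state change it brings about.
lexicon = {root_verb(example.sentence): state_change(example)}
print(lexicon)
```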

Cited by 6 publications (3 citation statements). References 10 publications.
“…While most prior computational work on meaning acquisition focuses on contextual learning by scanning texts, some notable work has pursued a path similar to that described here, attempting to learn from correlated linguistic and non-linguistic input. In [16,17], Salveter describes a system called MORAN. The non-linguistic component of each scenario presented to MORAN consists of a sequence of exactly two scenes, where each scene is described by a conjunction of atomic formulas.…”
Section: Related Work
Confidence: 99%
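As a concrete illustration of the two-scene encoding described in the citation above, the hypothetical pick-up scenario from the earlier sketch could be written as two conjunctions of atomic formulas. The predicates and constants are invented for illustration; the paper's actual predicate vocabulary may differ.

```latex
% Scene 1 (before the action), written as one conjunction of atomic formulas:
\[ S_{1} \;=\; \mathit{at}(\mathit{block}_{1}, \mathit{table}) \;\wedge\; \mathit{emptyhanded}(\mathit{John}) \]
% Scene 2 (after the action):
\[ S_{2} \;=\; \mathit{holding}(\mathit{John}, \mathit{block}_{1}) \]
```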
“…Several natural language systems have been reported which learn the meanings of new words [5,7,1,16,17,13,14]. Many of these systems (in particular [5,7,1]) learn the new meanings based upon expectations arising from the morphological, syntactic, se-…”
Section: Introduction
Confidence: 99%
“…The first cognitive models to explore word learning come from Siskind's (1990; 1996) work. Although some other researchers had already computationally explored the acquisition of word meaning (Salveter, 1979; Pustejovsky, 1988), Siskind's (1996) model was the first to implement a cross-situational strategy designed specifically to solve referential uncertainty in a cognitively plausible way. Many of the models which followed echo in some way the decisions made by this early work, so it is invaluable to spend some time understanding its inner workings, its assumptions about lexical acquisition, what data was used, and its results.…”
Section: Siskind's Early Work
Confidence: 99%
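The cross-situational strategy mentioned in the citation above can be summarised with a minimal sketch: across multiple situations, a word is hypothesised to mean only those candidate referents it consistently co-occurs with, so intersecting the candidate sets shrinks referential uncertainty. This is a toy illustration with invented data, not Siskind's actual algorithm, which additionally handles noise, homonymy, and compositional meanings.

```python
# Toy cross-situational learner: for each word, intersect the sets of
# candidate meanings observed across every situation in which the word occurs.
def cross_situational(observations):
    """observations: list of (utterance_words, candidate_meanings) pairs."""
    hypotheses = {}
    for words, meanings in observations:
        for w in words:
            if w not in hypotheses:
                hypotheses[w] = set(meanings)   # first exposure: everything is possible
            else:
                hypotheses[w] &= set(meanings)  # later exposures: keep only consistent meanings
    return hypotheses

# Invented example data: each situation pairs an utterance with the
# meanings plausibly present in the accompanying scene.
obs = [
    ({"john", "picks", "block"}, {"JOHN", "PICK-UP", "BLOCK", "TABLE"}),
    ({"john", "drops", "ball"},  {"JOHN", "DROP", "BALL"}),
]
print(cross_situational(obs)["john"])   # -> {'JOHN'}
```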