1991
DOI: 10.1080/09540099108946586
A Network for Encoding, Decoding and Translating Locative Prepositions

Cited by 16 publications (8 citation statements)
References 3 publications
“…On the one hand, their work represents, beyond a doubt, the single largest-scale effort in this direction to date. In marked contrast with related connectionist work (Cosic & Munro, 1988; Harris, 1989; Munro et al., 1991; Regier, 1992), which tends to focus closely on specific aspects of the perceptually grounded language acquisition task, Nenov and Dyer essentially tackle the problem as a whole. The resulting system, DETE, is quite remarkable for its breadth of coverage, encompassing syntactic acquisition, spatial concept learning, binding via phase synchrony, generalization at the morphological level for gender agreement, and the use of a novel and apparently powerful connectionist sequential learning mechanism, Katamic memory.…”
Section: Introduction (mentioning)
confidence: 88%
“…The purpose of the encoder layer is two-fold: to force the development of distributed representations for each variable, with each unit in the encoder bank responding to a certain distribution in the input variable [16], and to handle missing observations [14]. In the event of a missing datum, the corresponding encoder gives a zero response to the hidden layer; this way the encoders do not map missing values in the same way they do available data, using what we consider to be a more neutral representation instead [14].…”
Section: Case Materials and Methods (mentioning)
confidence: 99%
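The encoder-bank scheme described in the excerpt above can be sketched roughly as follows. This is a minimal illustration, not the cited implementation: the Gaussian tuning curves, the unit count, and the tuning width are assumptions; only the two quoted properties are reproduced, namely that each unit responds to a region of the input variable's range, and that a missing observation yields an all-zero (neutral) response distinct from any encoded value.

```python
import numpy as np

def encode_variable(x, centers, width=1.0):
    """Distributed encoding of one scalar input variable.

    Each unit in the encoder bank responds to a neighborhood of its
    center (Gaussian tuning is an assumption here, for illustration).
    A missing datum (None or NaN) produces a zero response, so missing
    values are never mapped the same way as available data.
    """
    if x is None or (isinstance(x, float) and np.isnan(x)):
        return np.zeros(len(centers))  # neutral code for a missing observation
    return np.exp(-((x - np.asarray(centers, dtype=float)) ** 2)
                  / (2.0 * width ** 2))

centers = [0.0, 0.5, 1.0]          # hypothetical 3-unit encoder bank
print(encode_variable(0.5, centers))   # response peaks on the middle unit
print(encode_variable(None, centers))  # all zeros: missing datum
```

The zero response is "neutral" in the sense that, with zero input from that encoder, the corresponding weights into the hidden layer simply contribute nothing, rather than asserting a spurious value.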
“…John and McClelland (1990), the middle one among three hidden layers encodes a sentence gestalt that is fed back to the input layer according to the same mechanism as in Elman's SRN. In Munro et al. (1991), one perceptron-like network encoding a sentence in English feeds another network with a similar structure that performs the translation into German.…”
Section: Connectionist Knowledge Representation (mentioning)
confidence: 99%
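The two-network arrangement attributed to Munro et al. (1991) in the excerpt above can be sketched in a few lines: a first perceptron-like network maps an English sentence vector to a hidden code, and a second, similarly structured network maps that code to a German output. All dimensions, the random untrained weights, and the sigmoid activation are illustrative assumptions, not details from the paper; the sketch shows only the chaining of encoder into translator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumptions for illustration only).
n_english, n_hidden, n_german = 12, 8, 12

# Network 1: perceptron-like encoder for the English sentence.
W_enc = rng.normal(scale=0.1, size=(n_hidden, n_english))
# Network 2: similar structure, maps the encoder's code to German output.
W_dec = rng.normal(scale=0.1, size=(n_german, n_hidden))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def translate(english_vec):
    code = sigmoid(W_enc @ english_vec)  # first network's hidden code
    return sigmoid(W_dec @ code)         # second network's translation output

sentence = np.zeros(n_english)
sentence[3] = 1.0                        # hypothetical one-hot input feature
print(translate(sentence).shape)         # (12,)
```

In a trained version, each network would be fit separately or jointly by error backpropagation; here the weights are random, so only the dataflow, not any actual translation, is demonstrated.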