2004
DOI: 10.1016/j.neunet.2004.07.004

Early lexical development in a self-organizing neural network

Cited by 210 publications (185 citation statements)
References 32 publications
“…Several computational models have therefore implemented simple learning mechanisms (such as Hebbian rules) to create associations between patterns of active neurons representing the phonological aspects of words and internal simulations (i.e., representations of object features involved in perception and action, cf. Jeannerod, 2007; Mayor & Plunkett, 2010; Caligiore et al., 2010; Li, Farkas, & MacWhinney, 2004).…”
Section: Introduction (mentioning)
confidence: 99%
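As a rough illustration of the mechanism this statement describes, the sketch below links a "phonological" and a "semantic" map with Hebbian weights that strengthen under co-activation. All sizes, rates, and variable names are illustrative assumptions, not parameters of any of the cited models.

import numpy as np

rng = np.random.default_rng(0)
n_phon, n_sem = 50, 50  # units on each map (hypothetical sizes)
eta = 0.1               # learning rate (illustrative)

def hebbian_associate(assoc, phon_act, sem_act):
    """Strengthen links between co-active phonological and semantic units."""
    assoc = assoc + eta * np.outer(phon_act, sem_act)
    return assoc / max(np.linalg.norm(assoc), 1e-12)  # keep weights bounded

# One word-object pairing: sparse activity patterns on each map
phon_act = (rng.random(n_phon) < 0.1).astype(float)
sem_act = (rng.random(n_sem) < 0.1).astype(float)
assoc = hebbian_associate(np.zeros((n_phon, n_sem)), phon_act, sem_act)

# Comprehension direction: phonological activity retrieves semantic activity
retrieved = phon_act @ assoc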
“…The most common unsupervised algorithm is Hebbian learning, which is often used to train self-organizing maps (e.g., Cohen et al., 2002; Li et al., 2004, 2007). Hebbian learning strengthens the weight between two units whose activity is correlated and weakens the weight between two units whose activity is uncorrelated.…”
Section: Issues In Modeling Changes In Experience (mentioning)
confidence: 99%
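The strengthen-correlated/weaken-uncorrelated behavior quoted above matches a covariance-style Hebbian rule, sketched below as a generic minimal example (not any cited model's code); names and values are illustrative.

import numpy as np

def covariance_hebbian(pre, post, w, eta=0.05):
    """Update weights w (pre x post) from batches of pre/post activity."""
    pre_c = pre - pre.mean(axis=0)     # centering makes uncorrelated pairs
    post_c = post - post.mean(axis=0)  # produce negative (weakening) updates
    return w + eta * (pre_c.T @ post_c) / len(pre)

rng = np.random.default_rng(1)
pre = rng.random((100, 20))              # 100 samples, 20 presynaptic units
post = pre @ rng.random((20, 10)) * 0.5  # correlated postsynaptic activity
w = covariance_hebbian(pre, post, np.zeros((20, 10)))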
“…This is akin to the processes of synaptogenesis and pruning: The former forms some maximum number of synapses in early childhood, and the latter eliminates unnecessary synapses to eventually settle at adult levels (Huttenlocher & Dabholkar, 1997). Another example of the use of self-organizing maps to model neural plasticity can be found in models of language acquisition by Li et al. (2004) and Farkas and Li (2002). A final way in which neural plasticity has been modeled is through the manual addition of hidden units in a static network. In contrast to cascade-correlation frameworks, which recruit additional hidden units as training proceeds, this approach involves the use of different model structures at different ages.…”
(mentioning)
confidence: 99%
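A minimal sketch of the "different model structures at different ages" idea, under the assumption of a plain two-layer network: extra hidden units are spliced in at a later developmental stage while trained weights are preserved. This is an illustration of the general approach, not code from any cited study.

import numpy as np

rng = np.random.default_rng(2)

def make_net(n_in, n_hidden, n_out):
    return {"w1": rng.normal(0, 0.1, (n_in, n_hidden)),
            "w2": rng.normal(0, 0.1, (n_hidden, n_out))}

def grow_hidden(net, extra):
    """Add `extra` hidden units, keeping the already-trained weights."""
    n_in = net["w1"].shape[0]
    n_out = net["w2"].shape[1]
    net["w1"] = np.hstack([net["w1"], rng.normal(0, 0.1, (n_in, extra))])
    net["w2"] = np.vstack([net["w2"], rng.normal(0, 0.1, (extra, n_out))])
    return net

net = make_net(n_in=10, n_hidden=5, n_out=3)  # early-age network
# ... train at the early age, then switch structures:
net = grow_hidden(net, extra=5)               # later-age network, 10 hidden units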
“…Note that algorithms for inducing part of speech categories from raw data (i.e., unsupervised POS tagging) abound, both in the cognitive linguistic literature (e.g., Li et al. [37] and references therein) and in the computational linguistic literature (e.g., Banko and Moore [5], Smith and Eisner [56]).…”
Section: Computational Grammar Induction (mentioning)
confidence: 99%
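For readers unfamiliar with the task, the sketch below shows one simple distributional route to unsupervised POS induction: cluster words by their left/right-neighbor co-occurrence vectors. This is a generic illustration, not the algorithm of any paper cited above; the toy corpus and parameters are made up.

import numpy as np

corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

# Context vectors: counts of each word's left and right neighbors.
ctx = np.zeros((len(vocab), 2 * len(vocab)))
for i, w in enumerate(corpus):
    if i > 0:
        ctx[idx[w], idx[corpus[i - 1]]] += 1               # left neighbor
    if i < len(corpus) - 1:
        ctx[idx[w], len(vocab) + idx[corpus[i + 1]]] += 1  # right neighbor

def kmeans(x, k, iters=20, seed=0):
    """Plain k-means over context vectors; k hypothetical word categories."""
    rng = np.random.default_rng(seed)
    centers = x[rng.choice(len(x), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((x[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = x[labels == j].mean(axis=0)
    return labels

labels = kmeans(ctx, k=3)
for w in vocab:
    print(w, labels[idx[w]])  # words with similar contexts share a cluster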