1997
DOI: 10.1016/s0893-6080(97)00020-8
Linsker-type Hebbian Learning: A Qualitative Analysis on the Parameter Space

Cited by 6 publications (3 citation statements)
References 45 publications
“…It seems that the power of extreme value theory in neural computation has not yet been fully realized. In previous papers [8][9][10][7] we developed a theory related to extreme values in other classes of neurodynamics. In [6], we applied extreme value theory to some challenging problems in neural computation, such as the capacity of the Hopfield model, learning curves of perceptron learning, etc.…”
Section: Discussion
confidence: 99%
“…The effect on these coefficients is such that if the magnitude of CI is large, then it gives a significant head start to the all-excitatory (all-inhibitory) profiles. Feng, Pan, and Roychowdhury (1995) analyzed how this head start would affect the outcome of the evolution of equation 2.1 if the synaptic weights were constrained to lie between fixed minimum and maximum values. These authors reached the conclusion that the stable fixed-point solutions to the equation would lie in four domains where receptive-field profiles are (1) all excitatory, (2) all inhibitory, (3) all excitatory or all inhibitory, or (4) of various shapes, including anisotropic ones.…”
Section: ~
confidence: 99%
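The excerpt above describes the dynamics analyzed by Feng, Pan, and Roychowdhury (1995): a linear Hebbian evolution equation whose synaptic weights are constrained between fixed minimum and maximum values, with a bias term that gives a "head start" to all-excitatory or all-inhibitory profiles. A minimal sketch of such dynamics is below; the toy covariance matrix `Q`, the constants `k1` and `k2`, and the weight limits are illustrative assumptions, not values or notation from the cited papers.

```python
import numpy as np

# Hypothetical sketch of Linsker-type Hebbian dynamics with hard weight
# limits. Q, k1, k2, and the limits are illustrative assumptions.
rng = np.random.default_rng(0)

n = 20                     # number of synapses
Q = np.eye(n) + 0.3        # toy input covariance: identity plus uniform correlation
k1, k2 = 0.5, -0.1         # k1 acts as the "head start" bias toward one sign
w_min, w_max = -1.0, 1.0   # hard box constraints on each synaptic weight

w = rng.uniform(-0.1, 0.1, size=n)  # small random initial weights
dt = 0.01
for _ in range(5000):
    # Linear Hebbian update: dw/dt = (Q + k2*I) w + k1, then clip to the box
    w = w + dt * ((Q + k2 * np.eye(n)) @ w + k1)
    w = np.clip(w, w_min, w_max)

# With a sufficiently large positive k1, every weight saturates at w_max:
# the all-excitatory receptive-field profile (one of the four domains of
# stable fixed points described in the excerpt).
print(np.all(w == w_max))
```

Flipping the sign of `k1` drives the weights to the all-inhibitory corner instead; with `k1` near zero the outcome depends on the initial condition and the structure of `Q`, which is how the other fixed-point domains arise.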
“…While most attention has been given in theoretical analysis to the emergence of center-surrounded or oriented receptive fields [6], [7], little is known about the underlying nonlinear dynamics of this multi-layer neural network. In this paper, we formulate this network as a coupled nonlinear differential system operating at different time-scales under vanishing perturbations.…”
Section: Introduction
confidence: 99%