Proceedings of the Thirtieth Hawaii International Conference on System Sciences
DOI: 10.1109/hicss.1997.663212
Improving the effectiveness of self-organizing map networks using a circular Kohonen layer

Cited by 9 publications (7 citation statements). References 22 publications.
“…Weights are updated for the winner node j* and for all nodes in the neighborhood defined by NE1(t), as shown in Fig. 2. The new weights under the proposed WNSOM are given by (3), and the exponentially varying function by (4), where the weighted neighborhood factor lies within a stated range. These equations give an update that decreases as we move away from the winning cell toward the other cells in the neighborhood. This is natural, since the winning neuron should receive the maximum reward compared to the others in its neighborhood.…”
Section: Proposed WNSOM Algorithm (mentioning)
confidence: 99%
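The WNSOM excerpt above describes the standard SOM step: find the winning node, then update it and its grid neighbors with a factor that decays exponentially with distance from the winner. The following is a minimal sketch of that idea; the cited paper's exact weighted-neighborhood factor is not reproduced here, so a Gaussian fall-off stands in for the "exponentially varying function", and the names `som_update`, `alpha`, and `sigma` are illustrative, not the authors' notation.

```python
import numpy as np

def som_update(weights, x, alpha=0.5, sigma=1.0):
    """One SOM step on a 2-D grid of weight vectors.

    weights : (rows, cols, dim) array of node weight vectors, updated in place
    x       : (dim,) input vector
    alpha   : learning rate
    sigma   : neighborhood width of the Gaussian stand-in factor
    """
    rows, cols, _ = weights.shape
    # Winner j* = node whose weight vector is closest to the input.
    dists = np.linalg.norm(weights - x, axis=2)
    wr, wc = np.unravel_index(np.argmin(dists), (rows, cols))
    # Squared grid distance of every node from the winner.
    rr, cc = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    grid_d2 = (rr - wr) ** 2 + (cc - wc) ** 2
    # Exponentially decreasing neighborhood factor (winner gets 1.0).
    h = np.exp(-grid_d2 / (2.0 * sigma ** 2))
    # Update is largest for the winner and shrinks with grid distance.
    weights += alpha * h[..., None] * (x - weights)
    return (wr, wc)
```

Because each node moves by a convex step `alpha * h` toward the input, the winner receives the full-rate update while distant cells barely move, which is the "decreasing reward" behavior the excerpt motivates.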
“…Lo and Bavarian [3] addressed the effect of neighborhood function selection on the rate of convergence of the SOM algorithm. Kiang et al. [4] developed a "circular" training algorithm that tries to overcome some of the ineffective topological representations caused by the "boundary" effect. Fritzke [5] proposed a new self-organizing neural network model that can determine the shape as well as the size of the network during the simulation in an incremental fashion.…”
mentioning
confidence: 99%
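The "boundary" effect arises because nodes on the edge of a rectangular grid have truncated neighborhoods. A circular (wrap-around) layer removes the edges by measuring grid distance on a torus, so every node has a full neighborhood. The helper below is a hedged sketch of that distance computation only; `toroidal_grid_dist2` is a hypothetical name and the details of Kiang et al.'s algorithm may differ.

```python
import numpy as np

def toroidal_grid_dist2(shape, winner):
    """Squared grid distance from `winner` to every node, with wrap-around.

    shape  : (rows, cols) of the Kohonen layer
    winner : (row, col) index of the winning node
    """
    rows, cols = shape
    wr, wc = winner
    dr = np.abs(np.arange(rows) - wr)
    dr = np.minimum(dr, rows - dr)   # take the shorter way around the rows
    dc = np.abs(np.arange(cols) - wc)
    dc = np.minimum(dc, cols - dc)   # take the shorter way around the columns
    return dr[:, None] ** 2 + dc[None, :] ** 2
```

Plugging this distance into the neighborhood factor in place of the flat grid distance makes corner and edge nodes indistinguishable from interior ones, which is the intended remedy for the boundary effect.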
“…Lo (1991) focused on the selection of the neighborhood function, and Kiang (2006) proposed a circular training algorithm to overcome the "boundary" effect on topological representations. In another study, an incremental learning algorithm was applied (Jun et al, 1993).…”
Section: Introduction (mentioning)
confidence: 99%
“…Lo and Bavarian addressed the effect of neighborhood function selection on the rate of convergence of the SOM algorithm [16]. Kiang et al. developed a "circular" training algorithm that tries to overcome some of the ineffective topological representations caused by the "boundary" effect [17]. Fritzke proposed a new self-organizing neural network model that can determine the shape as well as the size of the network during the simulation in an incremental fashion [18].…”
Section: Introduction (mentioning)
confidence: 99%