IEEE International Conference on Neural Networks
DOI: 10.1109/icnn.1993.298599

Incremental grid growing: encoding high-dimensional structure into a two-dimensional feature map

Abstract: Knowledge of clusters and their relations is important in understanding high-dimensional input data with unknown distribution. Ordinary feature maps with fully connected, fixed grid topology cannot properly reflect the structure of clusters in the input space: there are no cluster boundaries on the map. Incremental feature map algorithms, where nodes and connections are added to or deleted from the map according to the input distribution, can overcome this problem. However, so far such algorithms have been limited…
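The growing mechanism the abstract describes (nodes added to the map boundary according to the input distribution) can be illustrated with a minimal sketch. This is not the paper's actual algorithm; all names, the error-accumulation rule, and the growth criterion here are simplifying assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Nodes live at integer grid coordinates; each holds a weight vector in input space.
nodes = {(0, 0): rng.random(2), (1, 0): rng.random(2)}   # tiny initial map
errors = {pos: 0.0 for pos in nodes}                     # accumulated quantization error

def best_match(x):
    """Grid position whose weight vector is closest to input x."""
    return min(nodes, key=lambda p: np.linalg.norm(nodes[p] - x))

def train_epoch(data, lr=0.3):
    """One pass over the data: move each winner toward its input, track its error."""
    for x in data:
        w = best_match(x)
        errors[w] += np.linalg.norm(nodes[w] - x)
        nodes[w] += lr * (x - nodes[w])

def grow():
    """Add a node in a free grid slot next to the highest-error node (illustrative rule)."""
    w = max(errors, key=errors.get)
    for d in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        cand = (w[0] + d[0], w[1] + d[1])
        if cand not in nodes:            # open boundary slot -> grow here
            nodes[cand] = nodes[w].copy()
            errors[cand] = 0.0
            errors[w] = 0.0              # reset so growth spreads around the map
            return cand
    return None

data = rng.random((200, 2))
for _ in range(5):
    train_epoch(data)
    grow()

print(len(nodes))   # the map has grown beyond its initial 2 nodes
```

Because nodes are only attached in empty grid slots, the map's topology stays a partial 2-D grid, which is the property that lets cluster boundaries appear as gaps in the grid.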

Cited by 85 publications (48 citation statements)
References 8 publications
“…There are many approaches that apply these algorithms in classic neural networks (Islam et al, 2009), (Bortman and Aladjem, 2009), (Han and Qiao, 2013), (Yang and Chen, 2012). There are also many variations of SOM that allow a more flexible structure of the output map, which can be divided into two categories. In the first type, we include growing grid (GG), incremental GG (Blackmore and Miikkulainen, 1993), and growing SOM (GSOM) (Alahakoon et al, 2000), all coming in different variants. GG is the only variant that allows growing a new node from the interior of the grid (but this is a whole row or column of nodes).…”
Section: Flexible Structure In Neural…
confidence: 99%
“…The initial choice of the number of classes is arbitrary, and there is no method for choosing the size of the grid perfectly, even if some authors suggest using growing or shrinking approaches (Blackmore and Miikkulainen, 1993; Fritzke, 1994; Koikkalainen and Oja, 1990). To exploit the stochastic features of the SOM algorithm, and to obtain both a good clustering and a good organization, it proves more efficient to deal with "large" maps.…”
Section: Example I: The Country Database Pop_96
confidence: 99%
“…During the on-line stage, the quantization error measured on each input vector controls the growing process. Related work on growing Kohonen maps has been proposed in [11,8]. The allocation of new recognition codes (codebooks) is controlled by a threshold value that is found empirically.…”
Section: Self Organizing Maps
confidence: 99%
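The threshold-controlled allocation described in the last statement — adapt the winning code when its quantization error is small, allocate a new code otherwise — can be sketched as follows. The threshold value, learning rate, and data are illustrative assumptions, not taken from the cited works:

```python
import numpy as np

rng = np.random.default_rng(1)
THRESHOLD = 0.5                  # empirically chosen cutoff, as the quoted text notes
codebooks = [rng.random(2)]      # start with a single recognition code

def present(x, lr=0.2):
    """Winner's quantization error decides between adaptation and allocation."""
    dists = [np.linalg.norm(c - x) for c in codebooks]
    i = int(np.argmin(dists))
    if dists[i] > THRESHOLD:
        codebooks.append(x.copy())           # error too large: allocate a new code at x
    else:
        codebooks[i] += lr * (x - codebooks[i])  # otherwise adapt the winner toward x

for x in rng.random((100, 2)):
    present(x)

print(len(codebooks))
```

Growth stops once the codes cover the input distribution to within the threshold radius, which is why the choice of that threshold directly determines the final map size.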