Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks. IJCNN 2000. Neural Computing: New Challenges and Perspectives for the New Millennium
DOI: 10.1109/ijcnn.2000.859366
The growing hierarchical self-organizing map

Cited by 201 publications (106 citation statements). References 7 publications.
“…After that, we removed the most common stopwords, using an English stoplist, and accomplished a stemming phase with the algorithm by Porter (1980). After applying the TF-IDF term weighting function to find the significance of a feature (word) in tag representation (the IDF factor being log(N/df(t)), where N is the collection size and df(t) is the document frequency of a tag t), a dimensionality reduction stage was carried out to reduce the number of features per vector: we removed the terms with high (0.6) and low (0.02) document frequency values (Dittenbach et al (2000)). …”
Section: Representation By Document Content
Citation type: mentioning (confidence: 99%)
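The preprocessing stage quoted above can be sketched as follows. Only the IDF formula log(N/df(t)) and the two document-frequency thresholds (0.02 and 0.6) come from the quoted text; all function and variable names are illustrative, not from the cited paper.

```python
import math

# Sketch of the quoted pipeline: TF-IDF weighting with IDF = log(N / df(t)),
# followed by pruning of terms whose relative document frequency is above
# 0.6 or below 0.02. Names here are illustrative assumptions.
DF_LOW, DF_HIGH = 0.02, 0.6

def tfidf_vectors(docs):
    """docs: list of token lists; returns one {term: weight} dict per doc."""
    n = len(docs)
    df = {}  # document frequency: in how many docs each term appears
    for doc in docs:
        for term in set(doc):
            df[term] = df.get(term, 0) + 1
    # Dimensionality reduction: drop overly common and overly rare terms.
    kept = {t for t, c in df.items() if DF_LOW < c / n < DF_HIGH}
    vectors = []
    for doc in docs:
        vec = {}
        for term in set(doc):
            if term in kept:
                vec[term] = doc.count(term) * math.log(n / df[term])
        vectors.append(vec)
    return vectors
```

With five documents, a term appearing in every document (relative df 1.0) is pruned, while a term appearing in two of them (relative df 0.4) is kept and weighted by tf · log(5/2).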
“…SOM has proven to be an effective way not only to organize information, but also to visualize it, and even to allow content-addressable searches (Vesanto and Alhoniemi, 2000; Dittenbach et al, 2000; Russell et al, 2002; Perelomov et al, 2002; Roh et al, 2003; Jieh-Haur and Chen, 2012; Barrón-Adame et al, 2012). …”
Section: The Self-Organizing Map (SOM)
Citation type: mentioning (confidence: 99%)
“…The GHSOM algorithm is based upon a structure composed of Self-Organizing Maps (SOM) that grow, by increasing the number of nonlinear units, until a quality criterion of the classification has been satisfied (Dittenbach et al, 2000). The growing process builds a hierarchical structure composed of SOMs in which lower layers classify data with more granularity, as shown in Fig.…
Section: Growing Hierarchical Self-Organizing Maps
Citation type: mentioning (confidence: 99%)
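The growth criterion described in the quotation can be sketched as follows — a toy, winner-take-all simplification, not the paper's exact formulation. A map keeps gaining units until its mean quantization error (MQE) drops below a fraction tau of the error of the parent unit it refines; the training details, tau, and all names are illustrative assumptions.

```python
import math
import random

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def bmu(weights, x):
    # Index of the best-matching unit for sample x.
    return min(range(len(weights)), key=lambda i: dist(weights[i], x))

def train(weights, data, epochs=30, lr=0.5):
    # Simplified SOM step: move only the winner toward the sample
    # (no neighborhood function), enough to illustrate the growth loop.
    for _ in range(epochs):
        for x in data:
            i = bmu(weights, x)
            weights[i] = [w + lr * (xj - w) for w, xj in zip(weights[i], x)]
        lr *= 0.9
    return weights

def mqe(weights, data):
    # Mean quantization error: average distance of each sample to its BMU.
    return sum(dist(weights[bmu(weights, x)], x) for x in data) / len(data)

def grow_map(data, parent_error, tau=0.2, max_units=12):
    random.seed(0)
    dim = len(data[0])
    weights = [[random.random() for _ in range(dim)] for _ in range(2)]
    while True:
        weights = train(weights, data)
        err = mqe(weights, data)
        if err < tau * parent_error or len(weights) >= max_units:
            return weights, err
        # Growth step: add a unit between the unit with the largest
        # accumulated error and the unit most distant from it.
        errs = [0.0] * len(weights)
        for x in data:
            i = bmu(weights, x)
            errs[i] += dist(weights[i], x)
        worst = errs.index(max(errs))
        far = max(range(len(weights)),
                  key=lambda i: dist(weights[i], weights[worst]))
        weights.append([(a + b) / 2
                        for a, b in zip(weights[worst], weights[far])])
```

In the full algorithm, each unit whose error stays above a second threshold would additionally spawn a child map trained only on its own data, which yields the hierarchy of increasingly granular layers that the quotation describes.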
“…Some of the extensions of the SOM algorithm and architecture address the disadvantages of fixed size and missing hierarchical representation. One of the SOM-based models implementing an algorithm that deals with both issues is the Growing Hierarchical Self-Organizing Map (GHSOM) [5], [6], [7]. The GHSOM is a neural network architecture combining the advantages of two principal extensions of the Self-Organizing Map: dynamic growth and hierarchical structure.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)