2016
DOI: 10.1109/tnnls.2016.2570124
The Growing Hierarchical Neural Gas Self-Organizing Neural Network

Abstract: The growing neural gas (GNG) self-organizing neural network stands as one of the most successful examples of unsupervised learning of a graph of processing units. Despite its success, little attention has been devoted to its extension to a hierarchical model, unlike other models such as the self-organizing map, which has many hierarchical versions. Here, a hierarchical GNG is presented, which is designed to learn a tree of graphs. Moreover, the original GNG algorithm is improved by a distinction between a grow…

Cited by 25 publications (23 citation statements). References 40 publications.
“…However, the mechanism for inserting new neurons into the neural gas algorithm, based on a fixed insertion period, often distorts the formed structures and destabilizes the learning process. It was shown in article [16] that learning can be stabilized by setting an "accessibility radius" for the neurons: the insertion period is replaced by a threshold on the maximum distance between a neuron and every point of the learning dataset assigned to it.…”
Section: Literature Review and Problem Statement
confidence: 99%
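The accessibility-radius rule described above can be sketched in a few lines. This is a hypothetical illustration, not the algorithm from article [16]: the function name `insert_by_radius` and the seeding of a new neuron at the uncovered point itself are assumptions.

```python
import numpy as np

def insert_by_radius(neurons, data, radius):
    """Insert a new neuron wherever some data point lies farther than
    `radius` from every existing neuron, instead of inserting on a
    fixed period (hedged sketch of the accessibility-radius rule)."""
    neurons = [np.asarray(w, dtype=float) for w in neurons]
    for x in np.asarray(data, dtype=float):
        nearest = min(np.linalg.norm(x - w) for w in neurons)
        if nearest > radius:
            # The uncovered point itself seeds the new neuron.
            neurons.append(x.copy())
    return neurons
```

Because insertion is triggered only by genuinely uncovered points, the network stops growing once every training point is within the radius of some neuron, which is the stability property the quote attributes to this rule.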
“…An analysis of the pseudocode in Fig. 2 shows that the current solution s_current, relative to which new best solutions s_best are sought, is updated either when a new solution increases the criterion (2) or randomly according to the Gibbs distribution [16]. The initial search point formed by the create_initial_solution procedure can be either randomly generated or the result of a preliminary … where uniform_random is a function that generates a random number from the uniform distribution over the assigned range, and step_size is the size of the range in which solutions neighboring s_current are sought.…”
Section: Fig. 2. Pseudocode of the Metaheuristic Simulated Annealing Algorithm
confidence: 99%
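The loop the quote describes — uniform sampling within `step_size` of the current solution, with Gibbs-distributed acceptance of non-improving moves — can be sketched as follows. This is a minimal minimization variant under assumed names (`objective`, `temps`); the cited paper frames the criterion as one to be increased, so signs would flip for maximization.

```python
import math
import random

def simulated_annealing(objective, s_init, step_size, temps):
    """Sketch of the search loop described in the quote: a neighbor is
    drawn uniformly within step_size of s_current, and a worse solution
    is still accepted with Gibbs probability exp(-delta / T)."""
    s_current = s_best = s_init
    f_current = f_best = objective(s_init)
    for T in temps:  # cooling schedule: a decreasing sequence of temperatures
        candidate = s_current + random.uniform(-step_size, step_size)
        f_cand = objective(candidate)
        delta = f_cand - f_current
        if delta < 0 or random.random() < math.exp(-delta / T):
            s_current, f_current = candidate, f_cand
        if f_cand < f_best:  # track the best solution seen so far
            s_best, f_best = candidate, f_cand
    return s_best
```

The Gibbs acceptance step is what lets the search escape local optima early on, while the shrinking temperature makes it behave like greedy hill-climbing near the end.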
“…A Growing Hierarchical Bregman Neural Gas (GHBNG) network is defined as a Growing Hierarchical Neural Gas (GHNG) network [5] in which Bregman divergences are used to compute the winning neuron. A GHBNG network can be seen as a tree of Growing Neural Gas (GNG) networks [3] with a mechanism to control the growth of each GNG graph.…”
Section: The GHBNG Model
confidence: 99%
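The winner rule the GHBNG description implies — pick the neuron minimizing a Bregman divergence to the input — can be sketched generically. This is an illustration of the standard Bregman definition, not code from [5]; the function names are assumptions.

```python
import numpy as np

def bregman_divergence(phi, grad_phi, x, w):
    """D_phi(x, w) = phi(x) - phi(w) - <grad phi(w), x - w>."""
    return phi(x) - phi(w) - float(np.dot(grad_phi(w), x - w))

def winning_neuron(x, weights, phi, grad_phi):
    """Index of the neuron whose weight minimizes the Bregman
    divergence to the input x."""
    divs = [bregman_divergence(phi, grad_phi, x, w) for w in weights]
    return int(np.argmin(divs))

# With phi(z) = ||z||^2 the divergence reduces to the squared
# Euclidean distance, recovering the usual GNG winner rule.
sq = lambda z: float(np.dot(z, z))
grad_sq = lambda z: 2.0 * z
```

Swapping in other convex generators phi (e.g. one yielding the generalized Kullback-Leibler divergence) changes the notion of "closest" neuron without altering the rest of the network.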
“…These self-organizing models have hierarchical versions, such as the Growing Hierarchical Self-Organizing Map (GHSOM) for the SOM [4] and the Growing Hierarchical Neural Gas (GHNG) for the GNG [5], in which a neuron can be expanded into a new map or graph in a subsequent layer of the hierarchy depending on the quantization error associated with that neuron or the graph it belongs to. Hierarchical models reflect hierarchical relations among the input data in a more straightforward way.…”
Section: Introduction
confidence: 99%
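The expansion criterion mentioned above — grow a child map or graph under a neuron when its quantization error is too high — can be sketched as a simple threshold test. This is a generic illustration, not the GHSOM/GHNG rule from [4] or [5]; the threshold name `tau` is an assumption.

```python
import numpy as np

def quantization_error(neuron_w, assigned_points):
    """Mean distance from a neuron's weight vector to the data points
    it wins (its Voronoi region)."""
    if len(assigned_points) == 0:
        return 0.0
    return float(np.mean([np.linalg.norm(p - neuron_w)
                          for p in assigned_points]))

def should_expand(neuron_w, assigned_points, tau):
    """Hedged sketch of the hierarchical growth test: spawn a child
    map/graph under this neuron when its quantization error exceeds
    the user-set threshold tau."""
    return quantization_error(neuron_w, assigned_points) > tau
```

In a hierarchical model, the child network is then trained only on the points assigned to the expanded neuron, so each level refines a progressively smaller region of the input space.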
“…The original GNG model was proposed by Fritzke [9] and has become a standard in computer vision [10] and robotics [11] applications, alongside other self-organizing models used for foreground detection [12] and object tracking [13] in video sequences.…”
Section: Introduction
confidence: 99%