2014
DOI: 10.1016/j.neunet.2014.01.005
Growing Neural Gas approach for obtaining homogeneous maps by restricting the insertion of new nodes

Abstract: The Growing Neural Gas model is used widely in artificial neural networks. However, its application is limited in some contexts by the proliferation of nodes in dense areas of the input space. In this study, we introduce some modifications to address this problem by imposing three restrictions on the insertion of new nodes. Each restriction aims to maintain the homogeneous values of selected criteria. One criterion is related to the square error of classification and an alternative approach is proposed for avoi…
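The abstract describes gating the standard Growing Neural Gas insertion step so that new nodes are only added when doing so keeps a selected criterion (e.g. accumulated square error) homogeneous across the map. The paper's exact criteria are truncated above, so the following is only a minimal sketch under an assumed criterion: insert only while the worst node's accumulated error clearly exceeds the map average. The function names and the `threshold_ratio` parameter are hypothetical, not taken from the paper.

```python
def should_insert(errors, threshold_ratio=1.5):
    """Hypothetical insertion gate: permit a new node only if the largest
    accumulated error is markedly above the mean, i.e. the error
    distribution over the map is not yet homogeneous."""
    mean_error = sum(errors) / len(errors)
    return max(errors) > threshold_ratio * mean_error

def insert_node(nodes, errors):
    """Standard GNG-style insertion, simplified: place a new node at the
    midpoint between the two nodes with the highest accumulated error
    (in full GNG, the worst node and its worst topological neighbour)."""
    ranked = sorted(range(len(nodes)), key=lambda k: errors[k])
    i, j = ranked[-2], ranked[-1]
    midpoint = [(a + b) / 2.0 for a, b in zip(nodes[i], nodes[j])]
    return nodes + [midpoint]
```

With this gate in the main GNG loop, insertion simply stops once errors even out, which is one plausible way to prevent the node proliferation in dense input regions that the abstract describes.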

Cited by 7 publications (2 citation statements) | References 19 publications
“…Since then, it has been mostly applied in the context of vector quantization or data compression, such as image processing, speech recognition or pattern recognition.30,31 A batch variant of NGN, which shows much faster convergence and which can be interpreted as an optimization of the cost function by the Newton method, was proposed by Cottrell et al.32 Quintana-Pacheco et al.33 proposed a modified version of the NGN algorithm where the main goal consists of avoiding prototype proliferation in dense areas of the input space. García-Rodríguez et al.34 addressed the ability of self-organizing neural network models to manage real-time applications by introducing the so-called fAGNG (fast Autonomous Growing Neural Gas), a modified learning algorithm for the incremental NGN network model.…”
Section: PGNGNs; citation type: mentioning
confidence: 99%
“…32 Quintana-Pacheco et al.33 proposed a modified version of the NGN algorithm where the main goal consists of avoiding prototype proliferation in dense areas of the input space. García-Rodríguez et al.…”
Section: PGNGNs; citation type: mentioning
confidence: 99%