1996
DOI: 10.1016/0377-2217(96)00033-1
Heuristic and optimization approaches to extending the Kohonen self organizing algorithm

Cited by 25 publications (12 citation statements)
References 40 publications
“…In this study, clustering involves two distinct tasks: (1) determining the number of clusters present in the reference base; and (2) assigning each reference user to one cluster. The number of clusters, which is the number of nodes in the output layer, depends on the expected number of clusters, but there is currently no apparent practical or theoretical way of determining the optimal size of the output layer (Nour & Madey, 1996). Because random cluster initialization can cause instability, a policy for initial cluster selection is required.…”
Section: SOM Cluster-Indexing CBR CF Model
confidence: 99%
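Since, as the quoted passage notes, there is no principled rule for the output-layer size, a common practical workaround is to train maps of several sizes and compare their quantization error. The sketch below assumes a 1-D Kohonen map in NumPy with synthetic data; it illustrates that workaround, not the heuristic or optimization approaches of the cited paper.

```python
# Minimal 1-D Kohonen SOM sketch (illustrative assumptions throughout):
# train maps of several output-layer sizes and compare quantization error.
import numpy as np

def train_som(data, n_nodes, n_iters=500, lr0=0.5, seed=0):
    """Train a 1-D Kohonen map with n_nodes output nodes."""
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    weights = rng.uniform(data.min(0), data.max(0), size=(n_nodes, dim))
    sigma0 = max(n_nodes / 2.0, 1.0)
    for t in range(n_iters):
        frac = t / n_iters
        lr = lr0 * (1.0 - frac)                # linearly decaying learning rate
        sigma = sigma0 * (1.0 - frac) + 1e-3   # shrinking neighborhood radius
        x = data[rng.integers(len(data))]
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))  # best-matching unit
        grid_dist = np.abs(np.arange(n_nodes) - bmu)          # distance on the node grid
        h = np.exp(-(grid_dist ** 2) / (2 * sigma ** 2))      # neighborhood function
        weights += lr * h[:, None] * (x - weights)
    return weights

def quantization_error(data, weights):
    """Mean distance from each sample to its nearest node."""
    d = np.linalg.norm(data[:, None, :] - weights[None, :, :], axis=2)
    return d.min(axis=1).mean()

rng = np.random.default_rng(1)
# Three tight synthetic clusters; the "right" map size is 3, but nothing
# in the algorithm itself reveals that number in advance.
data = np.vstack([rng.normal(c, 0.1, size=(50, 2)) for c in ((0, 0), (1, 1), (0, 1))])
for n in (2, 3, 5, 8):
    w = train_som(data, n)
    print(n, round(quantization_error(data, w), 3))
```

The quantization error keeps falling as nodes are added, which is exactly why it can guide but not decide the output-layer size: the curve has no sharp optimum, so the choice remains a judgment call.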
“…The learning rate influences both the speed of the algorithm and its convergence. If the coefficient is too large or too small, it can adversely affect the final result, so its selection is important [24]. Although much research has been conducted on choosing a suitable learning rate, it is still set experimentally.…”
Section: Second SOM Architecture and Parameter Selection
confidence: 99%
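Because, as the passage above notes, the learning rate is still set experimentally, the usual compromise is to decay it over training rather than fix one value. The schedules below (linear, inverse-time, exponential) are common illustrative choices, not ones prescribed by the cited work; all start at an assumed `lr0` and shrink toward zero so early iterations order the map and late ones fine-tune it.

```python
# Three common SOM learning-rate decay schedules (assumed forms for
# illustration). t is the current iteration, n_iters the total count.

def lr_linear(t, n_iters, lr0=0.5):
    """Straight-line decay from lr0 to 0."""
    return lr0 * (1.0 - t / n_iters)

def lr_inverse(t, lr0=0.5, tau=100.0):
    """Inverse-time decay: halves by t = tau."""
    return lr0 * tau / (tau + t)

def lr_exponential(t, n_iters, lr0=0.5, lr_end=0.01):
    """Geometric decay from lr0 to lr_end over n_iters."""
    return lr0 * (lr_end / lr0) ** (t / n_iters)

for t in (0, 250, 500):
    print(t, lr_linear(t, 500), round(lr_inverse(t), 3),
          round(lr_exponential(t, 500), 3))
```

The experimental part then reduces to picking `lr0`, `tau`, or `lr_end` per problem, which is a smaller search than tuning a single fixed rate for every phase of training.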
“…Thanks to its simple applicability, the KSOM has been widely applied to optimization problems, robotics and control, function approximation, and estimation and evaluation [28]. The size of the neuron map can be changed depending on the problem at hand.…”
Section: Kohonen SOM
confidence: 99%
“…However, the initial connection weights greatly affect the convergence of the network and the mapping direction [28]. In addition, in an optimization problem, finding the optimal solution of the objective function requires searching a broad area.…”
Section: Mutation Operator
confidence: 99%
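The sensitivity to initial connection weights can be seen even in a toy setting. The sketch below uses a deliberately simplified BMU-only Kohonen update on hypothetical two-cluster data (not the referenced experiment): the same short training run is started once from weights sampled inside the data and once from weights far outside it.

```python
# Toy illustration (assumed setup): how the initial connection weights
# change where a simplified Kohonen map ends up after identical training.
import numpy as np

def som_step(weights, x, bmu_lr=0.3):
    """One simplified Kohonen update: move only the best-matching unit."""
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
    weights[bmu] += bmu_lr * (x - weights[bmu])

rng = np.random.default_rng(0)
data = np.vstack([rng.normal(c, 0.1, size=(40, 2)) for c in ((0, 0), (2, 2))])

# Init A: weights sampled from the data. Init B: all weights far away,
# so one node captures every update and the rest never move.
w_a = data[rng.choice(len(data), 4, replace=False)].copy()
w_b = np.full((4, 2), 10.0) + rng.normal(0, 0.01, (4, 2))

for x in data[rng.permutation(len(data))]:
    som_step(w_a, x)
    som_step(w_b, x)

def qe(w):
    """Mean distance from each sample to its nearest node."""
    return np.linalg.norm(data[:, None] - w[None], axis=2).min(1).mean()

print(round(qe(w_a), 3), round(qe(w_b), 3))
```

Starting far from the data leaves most nodes permanently unused, so the final quantization error is much worse despite identical data and updates, which is the convergence sensitivity the quoted passage describes.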