2014
DOI: 10.1007/978-3-319-07176-3_67
Asymmetric k-means Clustering of the Asymmetric Self-Organizing Map

Abstract: In this paper, an asymmetric approach to clustering of the asymmetric Self-Organizing Map (SOM) is proposed. The clustering is performed using an improved asymmetric version of the well-known k-means algorithm; this improved asymmetric k-means algorithm is the second contribution of the paper. As a result, a two-stage, fully asymmetric data analysis technique is obtained. In this way, the structural consistency of the two methods is maintained, because both are formulated in asymmetric versions…
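The abstract only sketches the idea, but the core of an asymmetric k-means is that the point-to-centroid dissimilarity need not equal its reverse. The following is a minimal illustrative sketch under that assumption — the function names, the naive first-k initialization, the mean-based centroid update, and the choice of Kullback-Leibler divergence as the asymmetric measure are all this sketch's own assumptions, not details taken from the paper:

```python
import numpy as np

def asymmetric_kmeans(X, k, dissim, n_iter=50):
    """k-means-style clustering with an asymmetric dissimilarity.

    dissim(x, c) need not equal dissim(c, x); each point is assigned
    by dissim(point, centroid). Centroids are updated as the mean of
    assigned points -- a simplification; the paper's update rule may differ.
    """
    # Naive deterministic initialization: first k points as centroids.
    centroids = X[:k].copy()
    labels = np.zeros(len(X), dtype=int)
    for it in range(n_iter):
        # Assign each point to the centroid with minimal asymmetric dissimilarity.
        D = np.array([[dissim(x, c) for c in centroids] for x in X])
        new_labels = D.argmin(axis=1)
        if it > 0 and np.array_equal(new_labels, labels):
            break  # assignments stable: converged
        labels = new_labels
        # Recompute centroids; keep the old centroid if a cluster empties.
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    return labels, centroids

def kl(p, q, eps=1e-9):
    """Kullback-Leibler divergence on probability vectors: an example
    of an asymmetric dissimilarity, since KL(p||q) != KL(q||p) in general."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    return float(np.sum(p * np.log(p / q)))
```

For instance, clustering four probability vectors that form two well-separated groups recovers the two groups, while `kl` itself demonstrates the asymmetry that distinguishes this setting from ordinary k-means.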


Cited by 2 publications (2 citation statements) · References 17 publications
“…Concerning the simplification of the kNN.avg1 algorithm which we have considered, our experiments seem to show a statistically significant reduction of the quality of the classification due to its use for most of the parameter settings, i.e., when the pairs of algorithms (4,6), (10,11), (14,15) and (18,19) in Table 6 are compared. However, for the setting where both techniques perform the best, i.e., when the parameter k is dynamically tuned (pair (3,5)), there is no statistically significant difference between the original kNN.avg1 technique and its simplified version.…”
Section: Tab. 6, The averaged results of 200 runs of the compared algorithms
confidence: 87%
“…Then we propose to model the cases and proceed with the classification of the documents in the framework of the hidden Markov models and sequence mining [22], using the concepts of the computational intelligence [23], or employing the support vector machines [24]. We also pursued other paths, including semantic representation of documents, finding a parallel of the MTC with text segmentation, studying the asymmetry of similarity [13,14], devising new cluster analysis techniques [11] or investigating the applicability of the concepts related to the coreference detection in data schemas [17].…”
Section: Related Work
confidence: 99%