2018 IEEE Second International Conference on Data Stream Mining & Processing (DSMP)
DOI: 10.1109/dsmp.2018.8478616

Adaptive Kernel Data Streams Clustering Based on Neural Networks Ensembles in Conditions of Uncertainty About Amount and Shapes of Clusters

Abstract: A neural network approach to the data stream clustering task, where data are fed for processing in online mode under uncertainty about the number and shapes of clusters, is proposed in the paper. The main idea of this approach is based on kernel clustering and on ensembles of neural networks consisting of T. Kohonen's self-organizing maps. Each of the clustering neural networks contains a different number of neurons, where the number of clusters is connected with the quality of these neurons. All ens…
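The ensemble idea from the abstract can be illustrated with a minimal sketch: several Kohonen maps with different neuron counts are trained on the same data, and the map whose neurons quantize the data best is preferred. The function names, the flat (non-grid) map topology, the winner-take-all training, and the crude penalized quantization-error criterion below are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def train_som(X, m, lr=0.5, epochs=20, seed=0):
    """Train a simple Kohonen map with m neurons via online
    winner-take-all updates (a simplification of the real SOM)."""
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), m, replace=False)].astype(float)
    for epoch in range(epochs):
        eta = lr * (1 - epoch / epochs)  # decaying learning rate
        for x in X[rng.permutation(len(X))]:
            j = np.argmin(np.linalg.norm(W - x, axis=1))  # winning neuron
            W[j] += eta * (x - W[j])
    return W

def quantization_error(X, W):
    """Mean distance from each sample to its nearest neuron."""
    return np.mean(np.min(np.linalg.norm(X[:, None] - W[None], axis=2), axis=1))

def penalized_error(X, W, alpha=0.05):
    # Raw quantization error always favors more neurons; this crude
    # complexity penalty is an assumption -- the paper's actual quality
    # criterion for selecting among ensemble members differs.
    return quantization_error(X, W) + alpha * len(W)

def best_map(X, candidate_sizes=(2, 3, 4, 5)):
    """Train one map per candidate cluster count, keep the best one."""
    maps = {m: train_som(X, m) for m in candidate_sizes}
    m_star = min(maps, key=lambda m: penalized_error(X, maps[m]))
    return m_star, maps[m_star]
```

With well-separated data, the map whose neuron count matches the true number of clusters achieves a much lower quantization error than an underprovisioned one, which is what lets the ensemble resolve the unknown cluster count.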


Cited by 6 publications (5 citation statements). References 15 publications.
“…The last expression of (11) for calculating centroids of clusters can be rewritten in recurrent form, which essentially coincides with the self-learning WTM rule by T. Kohonen [5], where the factor … Thus, the process of high-dimensional data clustering (11), (12) is conveniently implemented using the architecture shown in Fig. 4, which is a modification of the neuro-fuzzy network of T. Kohonen [26,27].…”
Section: Online Data Compression for Reduction of Initial Feature Space (mentioning; confidence: 87%)
“…Fig. 4, which is a modification of the neuro-fuzzy network of T. Kohonen [26,27]. Here, the first hidden kernel layer (KL) is essentially a standard SOM neural network [5], which contains m neurons in the Kohonen layer whose synaptic weights (centroids) are tuned using the WTM learning rule (12); in the second hidden layer (ML), the membership levels of the k-th observation in the j-th cluster are estimated using the first relation of (11); and in the output layer (WL) the weights are calculated using the second relation of (11).…”
Section: Online Data Compression for Reduction of Initial Feature Space (mentioning; confidence: 99%)
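The WTM ("winner takes most") rule mentioned in the excerpt above updates not only the winning neuron but its neighbours as well, each scaled by a neighbourhood function. A minimal sketch of one online step, assuming a Gaussian neighbourhood over neuron indices (the exact neighbourhood function and rule (12) in the cited work may differ):

```python
import numpy as np

def wtm_step(W, x, eta=0.1, sigma=1.0):
    """One online WTM step: every neuron moves toward the input x,
    scaled by a Gaussian neighbourhood around the winning neuron."""
    winner = np.argmin(np.linalg.norm(W - x, axis=1))  # closest neuron
    idx = np.arange(len(W))
    h = np.exp(-(idx - winner) ** 2 / (2 * sigma ** 2))  # neighbourhood weights
    return W + eta * h[:, None] * (x - W)
```

As sigma shrinks toward zero, the neighbourhood weights vanish for all but the winner and the rule degenerates to the classical winner-take-all (WTA) update.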
“…Artificial neural networks have proven successful in solving this task. They automatically build highly accurate models capable of analyzing large and complexly structured data, but despite the many successful applications, these methods have a number of shortcomings (Bodyanskiy, Tyshchenko & Kopaliani, 2018; Zhernova et al, 2018).…”
Section: Information About the Authors (unclassified)