2016
DOI: 10.1017/s0890060416000160

A dynamic semisupervised feedforward neural network clustering

Abstract: An efficient single-layer dynamic semisupervised feedforward neural network clustering method with one-epoch training, data dimensionality reduction, and noise-data control abilities is discussed to overcome the problems of high training time, low accuracy, and high memory complexity in clustering. Dynamically, after the entrance of each new online input datum, the codebook of nonrandom weights and other essential information about the online data are updated and stored in …
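Since the abstract is truncated here, the following is only a rough sketch of the single-pass, dynamic codebook idea it describes: each new online datum either updates its nearest node or creates a new one, so weights are seeded from real data rather than at random. The Euclidean matching, the `threshold` parameter, and the running-mean update are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

class DynamicCodebook:
    """Minimal sketch of a single-pass, online codebook update.

    Assumptions (not taken from the paper): Euclidean distance matching,
    a fixed distance threshold for creating new nodes, and a running-mean
    weight update. The actual method also handles labels, dimensionality
    reduction, and noise control, which are omitted here.
    """

    def __init__(self, threshold=1.0):
        self.threshold = threshold   # max distance for assigning a datum to a node
        self.weights = []            # nonrandom weights: each node starts at a real datum
        self.counts = []             # how many data each node has absorbed

    def update(self, x):
        x = np.asarray(x, dtype=float)
        if not self.weights:                       # first datum seeds the first node
            self.weights.append(x.copy())
            self.counts.append(1)
            return 0
        dists = [np.linalg.norm(x - w) for w in self.weights]
        j = int(np.argmin(dists))
        if dists[j] <= self.threshold:             # update the winning node (running mean)
            self.counts[j] += 1
            self.weights[j] += (x - self.weights[j]) / self.counts[j]
            return j
        self.weights.append(x.copy())              # otherwise grow a new node dynamically
        self.counts.append(1)
        return len(self.weights) - 1
```

A data stream would be pushed through `update` exactly once (one-epoch training); nodes whose `counts` remain at 1 could later be treated as likely noise.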

Cited by 5 publications (14 citation statements). References 50 publications. Citing publications span 2018–2023.

Citation statements (ordered by relevance):
“…Because of these dynamic and nonstationary data, our system must use incremental learning methods. Incremental learning means learning by repeatedly adding or deleting nodes without destroying outdated knowledge and patterns (Asadi, 2016). Online continuous data are multidimensional and contain noisy elements (Asadi, 2016).…”
Section: Methods
confidence: 99%
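As a rough illustration of the add/delete cycle described above, the sketch below deletes rarely used nodes from such a codebook while leaving the surviving nodes' learned weights untouched, so old knowledge is preserved; the count-based pruning rule and the `min_count` parameter are illustrative assumptions, not the cited method's actual criterion.

```python
def prune_nodes(weights, counts, min_count=2):
    """Delete weak nodes without retraining the survivors.

    `weights` is a list of node weight vectors and `counts` the number of
    data each node has absorbed. Nodes seen fewer than `min_count` times
    are dropped (treated as likely noise); all other nodes keep their
    learned weights, so old knowledge is not destroyed.
    """
    kept = [(w, c) for w, c in zip(weights, counts) if c >= min_count]
    if not kept:
        return [], []
    new_weights, new_counts = map(list, zip(*kept))
    return new_weights, new_counts
```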
“…So it can process both data and knowledge (Asadi, 2016) without destroying old knowledge (Asadi, 2016). ODUFFNN stands for online dynamic unsupervised feedforward neural network (Asadi, 2016).…”
Section: Methods
confidence: 99%
“…The review and investigation of current UFFNN clustering methods reveals several sources of the mentioned problems that must be considered and solved (Asadi et al., 2013): using random weights, thresholds, and parameters to control the clustering tasks; initializing the weights randomly leads to both low accuracy and high training time. The clustering process is considerably slow because the weights have to be updated in every epoch during learning.…”
Section: Introduction
confidence: 99%
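To make the contrast in that statement concrete, the fragment below juxtaposes the two initialization styles it refers to: random weights, which then need repeated epochs of adjustment, versus weights seeded directly from the first observed data, the kind of nonrandom initialization that lets a single pass suffice. The function names and the choice of seeding from the first few inputs are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_init(n_nodes, dim):
    # Random weights: bear no relation to the data, so many epochs of
    # updates are typically needed before they settle near the clusters.
    return rng.normal(size=(n_nodes, dim))

def data_seeded_init(stream, n_nodes):
    # Nonrandom weights: each node starts at a real input vector, so the
    # codebook already lies in the data's region after a single pass.
    first = [np.asarray(x, dtype=float) for _, x in zip(range(n_nodes), stream)]
    return np.stack(first)
```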