2018
DOI: 10.14419/ijet.v7i4.11.20790

Using K-Fold Cross Validation Proposed Models for Spikeprop Learning Enhancements

Abstract: A Spiking Neural Network (SNN) uses individual spikes in the time domain both to perform and to communicate computation, in much the same way as actual neurons do. SNNs were not studied earlier because they were considered too complicated and too hard to analyze. Several limitations concerning the characteristics of SNNs that had not been researched before have been addressed since the introduction of SpikeProp in 2000 by Sander Bohte as a supervised SNN learning model. This paper describes the research developments of the enhancement …

Cited by 17 publications (11 citation statements) · References: 38 publications
“…Random Forests were implemented by repeatedly fitting the model to 1000 resampled subsets of the data (100 repeats of 10-fold cross-validation). For each repetition, the dataset was divided into 10 folds, of which 9 folds were used to perform an inner 10-fold cross-validation 20 . The number of trees to grow and the number of predictors randomly sampled as candidates at each split were set to the defaults 21 (number of trees = 500; number of predictors randomly selected = 2, 19 and 36), and the optimization criterion was maximization of the area under the Receiver Operating Characteristic (ROC) curve, known as AUC 22 .…”
Section: Methods
Citation type: mentioning (confidence: 99%)
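The nested resampling this statement describes can be sketched as follows. This is a minimal illustration in Python with scikit-learn, not the citing study's code: the dataset and random seeds are placeholder assumptions, and only the fold structure (100 repeats of 10-fold outer cross-validation around an inner 10-fold tuning loop), the 500-tree default, the candidate-predictor grid {2, 19, 36}, and the AUC criterion follow the quoted description.

```python
# Minimal sketch of the quoted scheme: 100 repeats of 10-fold outer CV
# (1000 resampled fits), each wrapping an inner 10-fold CV that tunes the
# number of predictors sampled per split over {2, 19, 36} by maximizing
# ROC AUC, with the 500-tree default. Data and seeds are placeholders.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import (GridSearchCV, RepeatedStratifiedKFold,
                                     cross_val_score)

X, y = make_classification(n_samples=300, n_features=36, random_state=0)

inner = GridSearchCV(
    RandomForestClassifier(n_estimators=500, random_state=0),
    param_grid={"max_features": [2, 19, 36]},  # caret-style mtry candidates
    scoring="roc_auc",
    cv=10,  # inner 10-fold CV on the 9 training folds of each outer split
)

# Outer loop: 100 repeats of 10-fold CV = 1000 resampled fits in total.
outer = RepeatedStratifiedKFold(n_splits=10, n_repeats=100, random_state=0)
scores = cross_val_score(inner, X, y, scoring="roc_auc", cv=outer)
print(f"outer AUC: {scores.mean():.3f} +/- {scores.std():.3f}")
```

Note that the full 100 × 10 outer scheme fits tens of thousands of forests; reduce n_repeats to try the sketch quickly.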
“…K-fold Cross-Validation involves splitting the data into k subsets. One of the k subsets is used as the validation set, while the other k-1 subsets are used as the training set [22], [23].…”
Section: Train/test
Citation type: mentioning (confidence: 99%)
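A minimal sketch of that split mechanics, assuming scikit-learn's KFold and placeholder data; it prints which samples land in the training and validation sets, so that each of the k subsets is held out exactly once.

```python
# Minimal sketch of k-fold splitting: each of the k subsets is held out
# once as the validation set while the remaining k-1 subsets form the
# training set. Data below are placeholders.
import numpy as np
from sklearn.model_selection import KFold

X = np.arange(20).reshape(10, 2)  # 10 placeholder samples, 2 features
kf = KFold(n_splits=5, shuffle=True, random_state=0)  # k = 5 here

for fold, (train_idx, val_idx) in enumerate(kf.split(X)):
    # k-1 folds train, 1 fold validates; every sample validates exactly once
    print(f"fold {fold}: train={train_idx.tolist()}, val={val_idx.tolist()}")
```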
“…The k parameter refers to the number of groups into which the dataset will be split. In this experiment, we used 10-fold cross-validation [30], [31].…”
Section: B. Dataset Split
Citation type: mentioning (confidence: 99%)
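For completeness, a minimal sketch of 10-fold cross-validation (k = 10) as an evaluation loop. The estimator and dataset are illustrative assumptions; the cited experiment's model is not specified in this excerpt.

```python
# Minimal sketch of 10-fold cross-validation (k = 10). The model and
# dataset below are illustrative assumptions, not the cited experiment's.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# cv=10 splits the data into 10 folds; each fold serves once as the test set.
scores = cross_val_score(model, X, y, cv=10)
print(f"10-fold accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```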