2015 3rd International Symposium on Computational and Business Intelligence (ISCBI)
DOI: 10.1109/iscbi.2015.9

Effects of Weight Initialization in a Feedforward Neural Network for Classification Using a Modified Genetic Algorithm

Cited by 10 publications (5 citation statements) · References: 37 publications
“…Therefore, it is advisable to find an optimal weight matrix that performs better on a certain architecture for a complex lithofacies identification problem. Recently, a number of global optimization algorithms have been applied to optimize the initialization weights and biases of deep learning networks (Nienhold et al., 2015; Paul Ijjina and Krishna Mohan, 2016; Khalifa et al., 2017; Yamasaki et al., 2017; Wang et al., 2018; Itano et al., 2018; Li and Liu, 2019).…”
Section: Discussion
confidence: 99%
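As context for the quoted passage above, the following is a minimal, illustrative sketch of how a genetic algorithm can search for initial weights and biases of a small feedforward network before gradient-based training. The network size, fitness function, and GA operators here are assumptions chosen for illustration; they are not taken from the cited works.

```python
# Minimal sketch: evolving initial weights for a small feedforward network
# with a genetic algorithm before any gradient-based training.
# All names, sizes, and hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data (stand-in for a real dataset).
X = rng.normal(size=(200, 8))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

N_IN, N_HID, N_OUT = 8, 10, 1
N_WEIGHTS = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # weights + biases

def unpack(vec):
    """Split a flat chromosome into weight matrices and bias vectors."""
    i = 0
    W1 = vec[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = vec[i:i + N_HID]; i += N_HID
    W2 = vec[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = vec[i:i + N_OUT]
    return W1, b1, W2, b2

def fitness(vec):
    """Negative cross-entropy of the untrained network: higher is better."""
    W1, b1, W2, b2 = unpack(vec)
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2))).ravel()
    eps = 1e-9
    return np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

# Evolve a population of candidate initializations.
POP, GENS = 30, 50
pop = rng.normal(scale=0.5, size=(POP, N_WEIGHTS))
for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-POP // 2:]]       # truncation selection
    cut = rng.integers(1, N_WEIGHTS, size=POP // 2)     # one-point crossover
    kids = np.array([np.concatenate([parents[i % len(parents)][:c],
                                     parents[(i + 1) % len(parents)][c:]])
                     for i, c in enumerate(cut)])
    kids += rng.normal(scale=0.05, size=kids.shape)     # Gaussian mutation
    pop = np.vstack([parents, kids])

best = pop[np.argmax([fitness(ind) for ind in pop])]
# `best` would now seed the network's weights, after which conventional
# backpropagation would continue the training.
```

In this kind of scheme the GA only supplies a starting point; the point of the quoted works is that a well-chosen starting point can noticeably change the accuracy the subsequent gradient training reaches.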
“…The accuracy of the results was 86.41% in [8] and 85.62% in [9], respectively. Nienhold D et al. [10] used a genetic algorithm and a feedforward neural network to classify and identify EEG data, and the error rate was reduced by about 10%. Alomari et al. [11] used the Coiflets wavelet to analyze and process the EEG signals, used amplitude estimation to extract the features, and finally employed machine learning algorithms for classification and recognition.…”
Section: Related Work
confidence: 99%
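For the wavelet-based feature extraction mentioned in the quote, a hedged sketch of a Coiflet decomposition of a single EEG channel followed by a simple amplitude statistic per sub-band is given below. The wavelet order ("coif3"), the decomposition level, and the mean-absolute-amplitude feature are illustrative assumptions, not the exact pipeline of Alomari et al. [11].

```python
# Hedged sketch: Coiflet wavelet decomposition of one EEG channel followed by
# simple amplitude-style statistics per sub-band (illustrative only).
import numpy as np
import pywt

rng = np.random.default_rng(0)
eeg_channel = rng.normal(size=1024)          # stand-in for one EEG channel

# Multi-level discrete wavelet transform with a Coiflet wavelet.
coeffs = pywt.wavedec(eeg_channel, wavelet="coif3", level=4)

# One amplitude-oriented feature per sub-band (mean absolute amplitude here).
features = np.array([np.mean(np.abs(c)) for c in coeffs])
# `features` would then be fed to a classifier, e.g. the GA-initialized
# feedforward network sketched earlier.
```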
“…According to the empirical formula, the number of hidden-layer neurons can be calculated and lies in the interval [10, 20]. This is an approach to verify the rationality of the number of hidden-layer neurons in our experiment.…”
Section: Data Preprocessing
confidence: 99%
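The quote does not state which empirical formula was used. One widely cited rule of thumb, shown below purely for illustration, is h = sqrt(n_in + n_out) + a with an integer a in [1, 10]; for plausible layer sizes it yields a candidate interval of roughly the quoted width. The input and output dimensions in the snippet are hypothetical.

```python
# Illustrative only: one common rule of thumb for sizing the hidden layer
# (not necessarily the empirical formula used in the cited work):
#     h = sqrt(n_in + n_out) + a,   a an integer in [1, 10]
import math

n_in, n_out = 90, 4                              # hypothetical layer sizes
base = math.sqrt(n_in + n_out)
candidates = [round(base + a) for a in range(1, 11)]
print(f"candidate hidden-layer sizes: {candidates[0]}..{candidates[-1]}")
```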
“…Then, finally, the characters were segmented using a contour-based bounding box segmentation method [16]. In phase two, the GLCM feature extraction technique was applied to the segmented characters to extract the statistical features of an image, which include contrast, energy, homogeneity, correlation, dissimilarity and Angular Second Moment (ASM). After obtaining these features, the network was trained using RBF [17] with the Nguyen-Widrow weight initialization method [19] and the CART algorithm, as shown in Fig. 3 and Fig. 4.…”
Section: Proposed Work
confidence: 99%
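The Nguyen-Widrow initialization mentioned in the quote is commonly described as: draw uniform random weights, rescale each hidden unit's weight vector to length beta = 0.7 * H^(1/n) (H hidden units, n inputs), and draw biases uniformly from [-beta, beta]. The sketch below follows that common description; the layer sizes and variable names are illustrative assumptions, not details of the cited work.

```python
# Minimal sketch of Nguyen-Widrow weight initialization for one hidden layer,
# following the commonly published description (illustrative assumptions only).
import numpy as np

def nguyen_widrow_init(n_inputs, n_hidden, rng=None):
    """Return (weights, biases) for a hidden layer of size n_hidden."""
    if rng is None:
        rng = np.random.default_rng()
    # Scale factor beta = 0.7 * H^(1/n), with H hidden units and n inputs.
    beta = 0.7 * n_hidden ** (1.0 / n_inputs)
    # Start from uniform random weights in [-1, 1].
    W = rng.uniform(-1.0, 1.0, size=(n_hidden, n_inputs))
    # Rescale each hidden unit's weight vector to length beta.
    W = beta * W / np.linalg.norm(W, axis=1, keepdims=True)
    # Biases drawn uniformly from [-beta, beta].
    b = rng.uniform(-beta, beta, size=n_hidden)
    return W, b

# Hypothetical usage: 6 GLCM features feeding 12 hidden units.
W, b = nguyen_widrow_init(n_inputs=6, n_hidden=12)
```

The rescaling spreads the hidden units' active regions across the input space, which is the usual rationale given for this initializer over plain uniform random weights.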