2021
DOI: 10.1080/00218464.2021.2001335
A Bayesian regularization-backpropagation neural network model for peeling computations

Cited by 48 publications (15 citation statements)
References 37 publications
“…Compared with other algorithms for improving ANN efficiency, Bayesian regularization shows superiority over the Levenberg-Marquardt algorithm [26,52], gradient descent with momentum and adaptive learning rate backpropagation (GDX) [53], and the Scaled Conjugate Gradient (SCG) method [54]. In addition, Gouravaraju [27] concludes that an artificial neural network with Bayesian regularization, combined with the K-fold cross-validation technique, can greatly reduce computation time with high accuracy and can be used to successfully study gecko adhesion problems. It is thus shown that using the ANN-based Bayesian regularization model to estimate the compressive strength of concrete at high temperatures is feasible, saving the time and cost of trials.…”
Section: Results
confidence: 99%
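The excerpt above pairs a regularized ANN with K-fold cross-validation. A minimal sketch of that pairing follows, with an ordinary fixed L2 penalty (`alpha` in scikit-learn's `MLPRegressor`) standing in for the adaptively tuned penalty of Bayesian regularization; the synthetic data and hyperparameters are illustrative, not from the cited paper.

```python
# Sketch: 5-fold cross-validation around a small L2-regularized neural
# network. The fixed alpha penalty is a stand-in for the adaptive
# penalty of Bayesian regularization; data and sizes are illustrative.
import numpy as np
from sklearn.model_selection import KFold
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(200)

scores = []
for train_idx, test_idx in KFold(n_splits=5, shuffle=True,
                                 random_state=0).split(X):
    net = MLPRegressor(hidden_layer_sizes=(16,), alpha=1e-2,
                       max_iter=2000, random_state=0)
    net.fit(X[train_idx], y[train_idx])
    scores.append(net.score(X[test_idx], y[test_idx]))  # R^2 per fold

print(f"mean 5-fold R^2: {np.mean(scores):.3f}")
```

Averaging the per-fold scores is what lets the regularization strength be judged on held-out data rather than on the training fit alone.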
See 1 more Smart Citation
“…As compared with some algorithms to improve ANN efficiency, Bayesian regularization shows superiority over Levenberg-Marquardt algorithm [26,52] and gradient descent with momentum and adaptive learning rate backpropagation -GDX [53], Scaled Conjugate Gradient (SCG) [54]. Besides, Gouravarajua [27] concludes that the artificial neural network-based Bayesian regularization combined with the K-fold cross-section technique which can greatly reduce computation time with high accuracy can be used to successfully study gecko adhesion problems. Thus, it is shown that utilizing the ANN-based Bayesian regularization model to estimate the compressive strength of concrete at high temperatures is achievable, hence saving time and money on trials.…”
Section: Resultsmentioning
confidence: 99%
“…Like other ML methods, ANNs can suffer from overfitting (especially when the data set is small but the model complexity is high) [26]. Regularization in an ANN reduces error, yielding the highest correlation coefficient and the lowest total squared error [27]. A regularization-tuning technique that has been used very effectively in ANNs is Bayesian regularization, which has been applied successfully to various problems such as stock price prediction and data mining [28][29][30][31][32].…”
Section: Introduction
confidence: 99%
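The regularized objective the excerpts refer to can be written as F = β·E_D + α·E_W, where E_D is the squared data error and E_W the sum of squared weights. The sketch below trains a tiny NumPy network on this objective with α and β held fixed; full Bayesian regularization would additionally re-estimate α and β during training. The data, network size, and step size are all illustrative assumptions.

```python
# Sketch: gradient descent on F = beta*E_D + alpha*E_W for a one-hidden-
# layer tanh network. Fixed alpha/beta; Bayesian regularization would
# adapt them. Synthetic data and hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (100, 1))
y = np.sin(3 * X)

W1 = 0.5 * rng.standard_normal((1, 8)); b1 = np.zeros(8)
W2 = 0.5 * rng.standard_normal((8, 1)); b2 = np.zeros(1)
alpha, beta, lr = 1e-3, 1.0, 0.05

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, H @ W2 + b2

mse_init = float(np.mean((forward(X)[1] - y) ** 2))
for _ in range(5000):
    H, out = forward(X)
    err = (out - y) / len(X)              # mean form for a stable step size
    g_out = 2 * beta * err                # d(beta*E_D)/d(out)
    gW2 = H.T @ g_out + 2 * alpha * W2    # weight-decay term from alpha*E_W
    gb2 = g_out.sum(0)
    g_H = (g_out @ W2.T) * (1 - H ** 2)   # backprop through tanh
    gW1 = X.T @ g_H + 2 * alpha * W1
    gb1 = g_H.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse_final = float(np.mean((forward(X)[1] - y) ** 2))
print(f"MSE before/after training: {mse_init:.4f} -> {mse_final:.4f}")
```

The α·E_W term is what shrinks the weights and counteracts the overfitting the excerpt describes.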
“…For simplicity, we use the ID_BB and ID_ker variables in lines 1–2 to assign new in-order IDs to the BBs and the kernels. In lines 5–14, we iterate over all the kernels and basic blocks in the application and inject a function after each basic block to collect the IDs of the BB and the kernel. In lines 15–21, we use another function to receive the ID data dynamically and calculate the BB counts.…”
Section: BB Trace Extraction
confidence: 99%
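The counting side of the scheme this excerpt describes can be sketched in a few lines: each instrumented basic block reports a (kernel ID, BB ID) pair, and a receiver accumulates execution counts per block. All names here are illustrative stand-ins for the cited tool's internals.

```python
# Sketch: accumulate per-basic-block execution counts from the
# (kernel_id, bb_id) events that injected callbacks would emit.
# Names and the simulated trace are illustrative assumptions.
from collections import Counter

bb_counts = Counter()

def record_bb(kernel_id, bb_id):
    """Stand-in for the function injected after each basic block."""
    bb_counts[(kernel_id, bb_id)] += 1

# Simulate a dynamic trace: kernel 0 executes BB 0 twice and BB 1 once.
for kernel_id, bb_id in [(0, 0), (0, 1), (0, 0)]:
    record_bb(kernel_id, bb_id)

print(bb_counts[(0, 0)])  # 2
```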
“…Although the trained networks showed good accuracy for some of the applications, they could not fit some of the benchmarks in Table 1. Therefore, we employ the Bayesian Regularization Backpropagation Neural Network (BR-BPNN), which uses the Bayesian regularization algorithm and minimizes a linear combination of squared errors and weights [14], to compare PNN against a competitive regression model.…”
Section: Deep Neural Network Model Architecture
confidence: 99%
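The "linear combination of squared errors and weights" mentioned above is the objective β·E_D + α·E_W, with α and β re-estimated from the data during training. The sketch below applies that re-estimation (MacKay's evidence approximation: γ effective parameters, α = γ/2E_W-style and β = (N−γ)/2E_D-style updates) to a linear-in-weights model with polynomial features, which is a simplified stand-in for the full BR-BPNN; the data and model size are illustrative.

```python
# Sketch: evidence-based re-estimation of the regularization precision
# alpha and noise precision beta on a linear-in-weights model, a
# simplified stand-in for BR-BPNN. Data and feature count illustrative.
import numpy as np

rng = np.random.default_rng(2)
N = 100
x = rng.uniform(-1, 1, N)
y = np.sin(3 * x) + 0.1 * rng.standard_normal(N)
Phi = np.vander(x, 10, increasing=True)   # degree-0..9 polynomial features

alpha, beta = 1.0, 1.0
for _ in range(50):
    A = beta * Phi.T @ Phi + alpha * np.eye(Phi.shape[1])
    m = beta * np.linalg.solve(A, Phi.T @ y)      # posterior mean weights
    lam = beta * np.linalg.eigvalsh(Phi.T @ Phi)
    gamma = np.sum(lam / (lam + alpha))           # effective no. of parameters
    alpha = gamma / (m @ m)                       # re-estimate weight penalty
    beta = (N - gamma) / np.sum((y - Phi @ m) ** 2)   # re-estimate noise precision

print(f"effective parameters gamma = {gamma:.2f} of {Phi.shape[1]}")
```

The point of the γ bookkeeping is exactly the trade-off the excerpt names: only the weights the data actually supports count against the error term, so the fit is penalized toward small weights without a hand-tuned penalty strength.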
“…It is a linear hybrid of Bayesian techniques and ANNs that automatically chooses appropriate weight values while training the ANN model (Okut, 2016). A more detailed discussion can be found in Gouravaraju et al. (2021). The second was called ANN-2 and was trained with Levenberg-Marquardt backpropagation, known as the TRAINLM function in MATLAB.…”
Section: f(x)
confidence: 99%