2018 | DOI: 10.1007/s42154-018-0045-5
Optimizing Neural Network Parameters Using Taguchi’s Design of Experiments Approach: An Application for Equivalent Stress Prediction Model of Automobile Chassis

Cited by 4 publications (2 citation statements; citing years 2019 and 2023). References 19 publications.
“…After the network has been trained, the optimal values of the ANN parameters are derived based on the performance statistics. When the Taguchi approach is employed to optimise the parameters, the ANN outperforms random parameter values (Patel and Bhatt, 2018).…”
Section: Introduction (mentioning)
Confidence: 99%
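The citing works pick up the core idea of Patel and Bhatt (2018): selecting ANN parameters from a small orthogonal-array experiment rather than at random. The sketch below is a minimal illustration of that idea, assuming scikit-learn's MLPRegressor, synthetic regression data, and hypothetical factor levels for hidden units, learning rate, and momentum; none of these settings are taken from the paper itself.

```python
# Minimal Taguchi-style parameter sweep for a small feed-forward network.
# Assumptions (not from the paper): scikit-learn's MLPRegressor, synthetic data,
# and illustrative levels for hidden units, learning rate, and momentum.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

# L9(3^3) orthogonal array: each row is one experiment, each column a factor,
# entries are level indices (0, 1, 2).
L9 = np.array([
    [0, 0, 0], [0, 1, 1], [0, 2, 2],
    [1, 0, 1], [1, 1, 2], [1, 2, 0],
    [2, 0, 2], [2, 1, 0], [2, 2, 1],
])

# Hypothetical factor levels, chosen only for this sketch.
hidden_units   = [8, 16, 32]
learning_rates = [1e-3, 1e-2, 1e-1]
momenta        = [0.5, 0.7, 0.9]

X, y = make_regression(n_samples=400, n_features=6, noise=5.0, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, random_state=0)

results = []
for run, (h, lr, m) in enumerate(L9, start=1):
    model = MLPRegressor(hidden_layer_sizes=(hidden_units[h],),
                         learning_rate_init=learning_rates[lr],
                         momentum=momenta[m],
                         solver="sgd", max_iter=2000, random_state=0)
    model.fit(X_tr, y_tr)
    mse = mean_squared_error(y_val, model.predict(X_val))
    results.append((run, hidden_units[h], learning_rates[lr], momenta[m], mse))
    print(f"run {run}: hidden={hidden_units[h]:>3}  lr={learning_rates[lr]:.3f}  "
          f"momentum={momenta[m]:.1f}  val MSE={mse:.2f}")

# The nine runs cover all three levels of every factor in a balanced way,
# which is why this small sweep can stand in for an exhaustive grid search.
best = min(results, key=lambda r: r[-1])
print("best setting (run, hidden, lr, momentum, MSE):", best)
```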
“…Using a statistical approach to optimize the performance of CNN models not only helps build optimum networks according to required criteria, but also makes models more explainable. Statistical experimental designs have been widely applied to various optimization problems [21][22][23], including parameter optimization of perceptron neural networks [24][25][26]. However, for proper optimization of the hyperparameters of deep learning networks, the design of experiments (DOE) is a powerful technique, and, to our knowledge, this has never been investigated.…”
Section: Introduction (mentioning)
Confidence: 99%
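The second excerpt points to the design-of-experiments analysis of the runs themselves. A conventional Taguchi follow-up is to convert each run's response into a smaller-is-better signal-to-noise ratio and then pick, for each factor, the level with the highest mean S/N. The snippet below sketches that step, reusing the hypothetical `L9` array and `results` list from the previous example; it is an illustration of the generic method, not the analysis reported in the paper.

```python
# Taguchi analysis of the L9 runs above: compute a smaller-is-better
# signal-to-noise (S/N) ratio per run, average it per factor level, and
# choose the level with the highest mean S/N for each factor.
# Names (L9, results) continue from the previous sketch and are illustrative.
import numpy as np

mse = np.array([r[-1] for r in results])   # validation MSE of each run
sn = -10.0 * np.log10(mse ** 2)            # smaller-is-better S/N ratio

factor_names = ["hidden units", "learning rate", "momentum"]
for col, name in enumerate(factor_names):
    # Mean S/N over the three runs at each level of this factor.
    level_means = [sn[L9[:, col] == lvl].mean() for lvl in range(3)]
    best_level = int(np.argmax(level_means))   # higher S/N is better
    print(f"{name}: mean S/N per level = {np.round(level_means, 2)} "
          f"-> choose level {best_level}")
```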