2020
DOI: 10.1088/1757-899x/778/1/012094
Artificial Neural Network Topology Optimization using K-Fold Cross Validation for Spray Drying of Coconut Milk

Abstract: In this study, the development of an optimized-topology neural network model for spray drying of coconut milk is investigated using the K-fold cross-validation technique. The performance of a standalone ANN and an ANN with K-fold cross-validation is compared; the K-fold cross-validation method is integrated into the neural network to overcome the limitations of a restricted dataset. Inlet temperature (140 °C–180 °C) and the concentrations of maltodextrin and sodium caseinate (0–10 w/w %) are established as the input parameters…
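The abstract describes the method only in prose. As a minimal sketch of the core idea, assuming scikit-learn's MLPRegressor as a stand-in for the paper's ANN and synthetic placeholder data in place of the spray-drying measurements (the real inputs are inlet temperature and the maltodextrin/sodium caseinate concentrations), K-fold cross-validation over a small dataset could look like this:

```python
# Minimal sketch: K-fold cross-validation wrapped around a small MLP regressor.
# Assumptions (not from the paper): scikit-learn's MLPRegressor stands in for the
# ANN, and the arrays below are synthetic placeholders for the spray-drying data
# (inlet temperature, maltodextrin w/w %, sodium caseinate w/w %).
import numpy as np
from sklearn.model_selection import KFold
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(140, 180, 60),   # inlet temperature, deg C
    rng.uniform(0, 10, 60),      # maltodextrin, w/w %
    rng.uniform(0, 10, 60),      # sodium caseinate, w/w %
])
y = rng.uniform(0, 1, 60)        # placeholder response variable

kf = KFold(n_splits=5, shuffle=True, random_state=0)
fold_mse = []
for train_idx, val_idx in kf.split(X):
    # Train a fresh network on each training split, validate on the held-out fold.
    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
    model.fit(X[train_idx], y[train_idx])
    fold_mse.append(mean_squared_error(y[val_idx], model.predict(X[val_idx])))

print("mean cross-validated MSE:", np.mean(fold_mse))
```

Averaging the per-fold errors gives a more stable performance estimate than a single train/test split, which is the stated motivation for integrating K-fold cross-validation when the dataset is small.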

Cited by 7 publications (5 citation statements). References 24 publications.
“…Therefore, in the case of limited input data, it constitutes one of the best approaches. According to Ming et al. [48], the ANN model with K-fold cross-validation gives a better prediction result compared to the stand-alone ANN model.…”
Section: ANN With K-Fold Cross Validation
Mentioning; confidence: 99%
“…We split the original dataset into a test dataset and a training dataset at a ratio of 3/10 and 7/10, respectively. To make the best use of the data, we choose the K-fold cross-validation method [14,15] to train the ANN [16].…”
Section: JINST 17 P10033
Mentioning; confidence: 99%
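As a brief illustration of the split described in this snippet (a sketch assuming scikit-learn; the arrays are hypothetical placeholders, not the citing paper's data):

```python
# Sketch of the 3/10 test, 7/10 training split described above (placeholder data).
import numpy as np
from sklearn.model_selection import train_test_split

X = np.random.rand(100, 3)   # hypothetical features
y = np.random.rand(100)      # hypothetical targets

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)
print(X_train.shape, X_test.shape)   # (70, 3) (30, 3)
```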
“…To take full advantage of the limited dataset, the K-fold cross-validation technique was chosen when training the neural network (see figure 5). When training a network with K-fold cross-validation [14,15], the test set is each of the k folds of the dataset in turn, and the training set is the remaining (k-1) folds. Therefore, in the process of training the network, k rounds of training and verification…”
Section: K-Fold Cross Validation
Mentioning; confidence: 99%
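The fold rotation described in this snippet can be made concrete with plain NumPy; this is an illustrative sketch, not the citing paper's code, and the sample count and k are placeholders:

```python
# Sketch of the fold rotation: each of the k folds serves once as the validation
# set while the remaining k-1 folds form the training set (placeholder data).
import numpy as np

k = 5
n = 50
indices = np.random.permutation(n)   # shuffle sample indices
folds = np.array_split(indices, k)   # k roughly equal folds

for i in range(k):                   # k rounds of training and verification
    val_idx = folds[i]
    train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
    # train_model(train_idx); evaluate_model(val_idx)  # hypothetical calls
    print(f"round {i}: train on {len(train_idx)} samples, validate on {len(val_idx)}")
```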
“…PSO techniques were integrated into the ANN to determine the optimum weights in the neural network design using MATLAB version 2019a. Lastly, the developed PSO-ANN was compared with an external ANN [27] and GA-ANN [28] based on MSE and R² evaluation and supported with a sensitivity analysis.…”
Section: Framework Study
Mentioning; confidence: 99%
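The citing paper implements PSO-ANN in MATLAB 2019a; the sketch below is only an illustrative Python/NumPy analogue, not the authors' code. It assumes a fixed 3-4-1 network, synthetic data, and standard global-best PSO constants, and it reports the MSE and R² metrics mentioned in the snippet:

```python
# Illustrative sketch (not the cited MATLAB implementation): PSO searches the
# flattened weight vector of a tiny 3-4-1 network to minimize MSE on placeholder data.
import numpy as np

rng = np.random.default_rng(1)
X = rng.random((80, 3))
y = X @ np.array([0.5, -0.3, 0.8]) + 0.1          # synthetic target

n_in, n_hid = 3, 4
dim = n_in * n_hid + n_hid + n_hid + 1            # all weights and biases

def forward(w, X):
    """Unpack the flat weight vector and run the 3-4-1 network."""
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid]; i += n_hid
    b2 = w[i]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def mse(w):
    return np.mean((forward(w, X) - y) ** 2)

# Plain global-best PSO with typical inertia/acceleration constants.
n_particles, iters = 30, 200
pos = rng.normal(0, 1, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([mse(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([mse(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

pred = forward(gbest, X)
r2_score = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print("MSE:", mse(gbest), "R²:", r2_score)
```

Because PSO treats the network as a black box, the same loop works regardless of the transfer functions or topology; only the weight-unpacking in forward() would change.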
“…The ANN used is a multilayer perceptron (MLP) neural network with an optimum configuration of topology 3-2-8-3, a logsig transfer function, and the Levenberg-Marquardt algorithm with K-fold cross-validation, which was established from research conducted by the previous authors [27]. Using normalized data, the optimal ANN design is determined by searching over a selected network topology design space: number of neurons (5–15), hidden neural layers (1–3), four different transfer functions, and seven training algorithms.…”
Section: Development Of Artificial Neural Network
Mentioning; confidence: 99%
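The topology search described in this snippet was carried out in MATLAB; as a condensed analogue only (assuming scikit-learn, placeholder normalized data, the logistic activation in place of logsig, and scikit-learn's default solver since Levenberg-Marquardt is not available there), a K-fold-scored sweep over layer and neuron counts might look like:

```python
# Condensed sketch (scikit-learn analogue, not the authors' MATLAB workflow):
# sweep hidden-layer counts (1-3) and neurons per layer (5-15) and score each
# topology by K-fold cross-validated MSE on placeholder, normalized data.
import numpy as np
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
X = rng.random((100, 3))   # 3 normalized inputs (placeholder)
y = rng.random((100, 3))   # 3 outputs, matching the 3-2-8-3 topology (placeholder)

cv = KFold(n_splits=5, shuffle=True, random_state=0)
best = None
for n_layers in range(1, 4):            # 1-3 hidden layers
    for n_neurons in range(5, 16):      # 5-15 neurons per layer
        model = MLPRegressor(hidden_layer_sizes=(n_neurons,) * n_layers,
                             activation="logistic",  # stand-in for logsig
                             max_iter=3000, random_state=0)
        score = -cross_val_score(model, X, y, cv=cv,
                                 scoring="neg_mean_squared_error").mean()
        if best is None or score < best[0]:
            best = (score, n_layers, n_neurons)

print("best CV MSE %.4f with %d hidden layer(s) of %d neurons" % best)
```

A full reproduction of the cited search would also loop over the four transfer functions and seven training algorithms mentioned in the snippet, which scikit-learn only partially exposes.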