2011
DOI: 10.1016/j.chemolab.2010.11.007
Neural network ensemble modeling for nosiheptide fermentation process based on partial least squares regression

Cited by 24 publications (14 citation statements)
References 18 publications
“…For the three datasets, the average method resulted in the following vector of weights w = [0.25 …]. As can be seen from Tables III-V, the proposed IOWA fusion method exhibits an interesting behavior, as it provides an NMSE very close to or better than that of the best estimator in the ensemble. It also outperformed the state-of-the-art method called residual-based correction [30] for the Tecator and wine datasets.…”
Section: Fusion With Induced Ordered Weighted Averaging Operators
confidence: 76%
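The IOWA fusion mentioned in the excerpt above can be sketched as follows: each estimator's output is paired with an inducing value, the outputs are reordered by decreasing inducing value, and a fixed weight vector is applied to the reordered outputs. The estimator outputs, inducing values, and weights below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def iowa_fuse(estimates, inducing, weights):
    """Induced ordered weighted averaging (IOWA) fusion.

    Each estimate a_i is paired with an inducing value u_i; estimates are
    reordered by decreasing u_i, then the weight vector is applied to the
    reordered estimates.
    """
    estimates = np.asarray(estimates, dtype=float)
    order = np.argsort(-np.asarray(inducing, dtype=float))  # decreasing u_i
    return float(np.dot(weights, estimates[order]))

# Three ensemble members; the inducing variable could be, e.g., a
# validation-accuracy score so that more reliable members come first.
preds = [1.0, 2.0, 3.0]      # hypothetical member predictions
accuracy = [0.9, 0.5, 0.7]   # hypothetical inducing values
w = [0.5, 0.3, 0.2]          # weights sum to 1
fused = iowa_fuse(preds, accuracy, w)
```

Because the weights attach to positions in the reordered list rather than to fixed estimators, a front-loaded weight vector emphasizes whichever member currently has the highest inducing value.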
“…As shown in Figure 1, the Elman NN consists of the context layer, input layer, hidden layer, and output layer [29][30][31]. W 1 denotes the weight from the context layer to the hidden layer, W 2 denotes the weight from the input layer to the hidden layer, and W 3 denotes the weight from the hidden layer to the output layer.…”
Section: Quality Prediction Model Based On Elman Neural Network Ensemble
confidence: 99%
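A minimal forward step matching the layer and weight naming above (W 1: context to hidden, W 2: input to hidden, W 3: hidden to output) might look like the sketch below; the dimensions, the tanh hidden activation, and the linear output layer are assumptions for illustration, not details from the cited paper.

```python
import numpy as np

def elman_step(x, context, W1, W2, W3):
    """One forward step of an Elman network.

    W1: context -> hidden, W2: input -> hidden, W3: hidden -> output.
    The new hidden state is fed back as the next step's context.
    """
    h = np.tanh(W1 @ context + W2 @ x)  # hidden layer sees input + context
    y = W3 @ h                          # linear output layer
    return y, h                         # h becomes the next context

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 3, 5, 1
W1 = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
W2 = rng.normal(scale=0.1, size=(n_hidden, n_in))
W3 = rng.normal(scale=0.1, size=(n_out, n_hidden))

context = np.zeros(n_hidden)
for x in [np.ones(n_in)] * 4:  # feed a short constant sequence
    y, context = elman_step(x, context, W1, W2, W3)
```

The recurrent context layer is what lets the Elman network carry state across fermentation time steps, which is why it suits process-quality prediction.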
“…Generally, optimization methods can overcome the deficiencies of NNs. Additionally, the NN ensemble is a technique that can significantly improve the generalization ability of NNs by training a number of NNs and then combining them ([31]; He and Cao, 2012). For regression problems, the most common combination methods in NN ensemble models are simple averaging and weighted averaging [34].…”
Section: Quality Prediction Model Based On Elman Neural Network Ensemble
confidence: 99%
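The two combination rules named above reduce to one line each: simple averaging gives every member equal weight, while weighted averaging applies a weight vector (often derived from validation error). The member predictions and weights below are illustrative assumptions.

```python
import numpy as np

# Rows: ensemble members; columns: predictions on two samples.
preds = np.array([
    [1.0, 2.0],   # member 1
    [1.2, 1.8],   # member 2
    [0.8, 2.2],   # member 3
])

# Simple averaging: equal weight for every member.
simple = preds.mean(axis=0)

# Weighted averaging: weights sum to 1; in practice they could be set
# inversely to each member's validation error.
w = np.array([0.5, 0.3, 0.2])
weighted = w @ preds
```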
“…[18] The goal of the combination is not only to merge the selected individual learners' results into a final ensemble output but also to exploit the complementarity arising from the participants' diversity. [18,19] Generally, the most common methods for regression problems are simple averaging or weighted averaging, [19] and almost all individuals use the same training data as input. [20] This article focuses on an indoor localization system based on an NN ensemble model.…”
Section: Introduction
confidence: 99%