2007
DOI: 10.1007/s00158-007-0159-6
Toward an optimal ensemble of kernel-based approximations with engineering applications

Cited by 80 publications (16 citation statements)
References 31 publications
“…The main idea behind model combination is to construct a combined model that yields better prediction accuracy than any single model. Usually one of two problems is tackled: building an ensemble of one particular model type with different hyperparameter settings [20], or building an ensemble of heterogeneous models with fixed hyperparameter settings [18,13,24]. Hyperparameters are parameters of the model or of the learning algorithm.…”
Section: Model Combination
confidence: 99%
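The second setting described in this statement, an ensemble of heterogeneous models with fixed hyperparameters, can be illustrated with a minimal Python sketch. This is not the paper's method; the two surrogates (a nearest-neighbour and a piecewise-linear model), the test function, the validation points, and the inverse-error weighting are all invented for the example.

```python
import math

# Training data sampled from an assumed true function f(x) = sin(x).
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ys = [math.sin(x) for x in xs]

def nearest(x):
    """Surrogate 1: nearest-neighbour prediction."""
    i = min(range(len(xs)), key=lambda i: abs(xs[i] - x))
    return ys[i]

def linear(x):
    """Surrogate 2: piecewise-linear interpolation."""
    x = min(max(x, xs[0]), xs[-1])
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            t = (x - xs[i]) / (xs[i + 1] - xs[i])
            return (1 - t) * ys[i] + t * ys[i + 1]
    return ys[-1]

# Held-out validation points estimate each member's prediction error.
val = [0.5, 1.5, 2.5, 3.5, 4.5, 5.5]

def rmse(model):
    return math.sqrt(sum((model(x) - math.sin(x)) ** 2 for x in val) / len(val))

errs = [rmse(nearest), rmse(linear)]

# Inverse-error weighting: more accurate members get larger weights.
w = [1.0 / e for e in errs]
s = sum(w)
w = [wi / s for wi in w]

def ensemble(x):
    """Convex combination of the two surrogates."""
    return w[0] * nearest(x) + w[1] * linear(x)
```

Because the ensemble is a convex combination, its squared error at each point is bounded by the weighted average of the members' squared errors, so it can never be worse than the worst member on the validation set.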
“…Additionally, we also used the native MATLAB neural networks toolbox [45] and the code developed by Gunn [57] to run the radial basis neural network and support vector regression algorithms, respectively. We used multiple instances of different surrogates in the same fashion as Viana et al. [29] and Sanchez et al. [58]. This is possible because kriging allows different instances by changing parameters such as the basis and correlation functions.…”
Section: Appendix A: Cross Validation For Surrogate Selection
confidence: 99%
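Cross validation for surrogate selection, as referenced in this appendix, can be sketched in a few lines. As an illustrative assumption (not the authors' code), Gaussian-kernel smoothers with different bandwidths stand in for the kriging instances with different correlation functions; the test function, sample grid, and bandwidth candidates are invented, and selection is by leave-one-out error.

```python
import math

# Toy data from an assumed function f(x) = x * sin(x).
xs = [0.5 * i for i in range(13)]          # 0.0 .. 6.0
ys = [x * math.sin(x) for x in xs]

def nw_predict(x, pts, vals, h):
    """Gaussian-kernel weighted average (Nadaraya-Watson) with bandwidth h."""
    w = [math.exp(-((x - p) / h) ** 2) for p in pts]
    s = sum(w)
    return sum(wi * v for wi, v in zip(w, vals)) / s

def loo_rmse(h):
    """Leave-one-out cross-validation error for a given bandwidth."""
    err = 0.0
    for i in range(len(xs)):
        pts = xs[:i] + xs[i + 1:]          # drop point i from the training set
        vals = ys[:i] + ys[i + 1:]
        err += (nw_predict(xs[i], pts, vals, h) - ys[i]) ** 2
    return math.sqrt(err / len(xs))

# Candidate surrogate instances: same model family, different bandwidths.
candidates = [0.1, 0.3, 0.6, 1.2, 2.5]
scores = {h: loo_rmse(h) for h in candidates}
best = min(scores, key=scores.get)         # instance with lowest LOO error
```

The same loop selects among genuinely heterogeneous surrogates as well: each candidate only needs a fit/predict interface so that it can be refit on every leave-one-out split.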
“…Once a surrogate model has been chosen, there are many ways of introducing it into the original algorithm. Several approaches have been proposed within general surrogate-assisted evolutionary frameworks, such as: integrating GAs with surrogate approximations (Queipo et al., 2005; Regis & Shoemaker, 2004) or landscape approximations (Knowles, 2006), surrogate-guided evolutionary operators (Rasheed, 2002), surrogate-assisted local search (Lim et al., 2008; Wanner et al., 2008), accelerating the optimization process using surrogate models, pre-selection approaches (Giannakoglou, 2002; Praveen & Duvigneau, 2009), multiple surrogates (Acar & Rais-Rohani, 2008; Lim et al., 2008; Sanchez et al., 2007), and coevolution of fitness predictors (Schmidt & Lipson, 2008). However, no surrogate-assisted AIS algorithm seems to have been proposed in the literature so far.…”
Section: Surrogate-Assisted Artificial Immune System
confidence: 99%