2011
DOI: 10.1007/978-3-642-16218-3

Evolutionary Statistical Procedures

Abstract: The use of general descriptive names, registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

Cited by 30 publications (7 citation statements)
References 0 publications
“…This fact is particularly important for the numerical distributed models widely applied in geosciences [80], where no closed form is available for the posterior conditionals. However, the use of genetic operators as proposal functions, which has given rise to Evolutionary MCMC algorithms [81,82], is probably the most promising computational tool for sampling unnormalized PDFs. The differential evolution adaptive Metropolis (DREAM) Markov chain Monte Carlo (MCMC) scheme [83,84] has been applied to the hydrological sciences due to its better convergence and mixing properties [85].…”
Section: Markov Chain Monte Carlo Sampling Methods
mentioning confidence: 99%
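The statement above describes using genetic operators as MCMC proposal functions, with DREAM as one instance. As an illustration only, and not taken from the cited book or papers, the following minimal Python sketch implements a differential-evolution Metropolis proposal (DE-MC style, the idea underlying DREAM); the unnormalized log posterior, chain count, and jitter scale are placeholder assumptions.

import numpy as np

def de_mc_sample(log_post, n_chains=8, dim=2, n_iter=5000, seed=0):
    """Minimal differential-evolution MCMC (DE-MC) sketch.

    Each chain proposes a move along the difference of two other randomly
    chosen chains, then accepts or rejects it with a standard Metropolis
    step on the unnormalized log posterior `log_post`.
    """
    rng = np.random.default_rng(seed)
    gamma = 2.38 / np.sqrt(2 * dim)             # standard DE-MC jump scale
    chains = rng.normal(size=(n_chains, dim))   # arbitrary initial states
    logp = np.array([log_post(x) for x in chains])
    samples = []
    for _ in range(n_iter):
        for i in range(n_chains):
            # pick two distinct chains (other than i) to form the difference vector
            r1, r2 = rng.choice([j for j in range(n_chains) if j != i],
                                size=2, replace=False)
            proposal = (chains[i]
                        + gamma * (chains[r1] - chains[r2])
                        + rng.normal(scale=1e-4, size=dim))  # small jitter
            logp_prop = log_post(proposal)
            # Metropolis accept/reject on the unnormalized density
            if np.log(rng.uniform()) < logp_prop - logp[i]:
                chains[i], logp[i] = proposal, logp_prop
        samples.append(chains.copy())
    return np.concatenate(samples)

# Example: sample a correlated bivariate Gaussian known only up to a constant.
if __name__ == "__main__":
    cov_inv = np.linalg.inv(np.array([[1.0, 0.8], [0.8, 1.0]]))
    log_post = lambda x: -0.5 * x @ cov_inv @ x
    draws = de_mc_sample(log_post)
    print(draws.mean(axis=0), np.cov(draws.T))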
“…A new population is formed by selecting fitter individuals from the parent population and the children population. After several generations (iterations), the algorithm converges to the best individual, which hopefully represents a (globally) optimal solution to the problem (Baragona et al. (2011) and Gen and Cheng (2000)).…”
Section: Evolutionary Feature Selection
mentioning confidence: 99%
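The quote above summarizes generational selection in a genetic algorithm: children are produced, then the next population keeps the fittest individuals from the combined parent and child populations. A minimal illustrative sketch follows; the binary encoding, crossover and mutation rates, and the OneMax fitness function are assumptions chosen for a self-contained example, not details from Baragona et al. (2011).

import numpy as np

def genetic_algorithm(fitness, dim=10, pop_size=30, n_gen=100, seed=0):
    """Toy (mu + lambda)-style genetic algorithm sketch.

    Children are produced by one-point crossover and bit-flip mutation;
    the next generation keeps the fittest individuals drawn from the
    combined parent and child populations, as in the quoted description.
    """
    rng = np.random.default_rng(seed)
    pop = rng.integers(0, 2, size=(pop_size, dim))      # binary encoding
    for _ in range(n_gen):
        # crossover: pair up shuffled parents and swap tails at a random cut point
        parents = pop[rng.permutation(pop_size)]
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            cut = rng.integers(1, dim)
            children[i, cut:], children[i + 1, cut:] = (
                parents[i + 1, cut:].copy(), parents[i, cut:].copy())
        # mutation: flip each bit with small probability
        flips = rng.random(children.shape) < 0.02
        children = np.where(flips, 1 - children, children)
        # selection: keep the fittest individuals from parents + children
        combined = np.vstack([pop, children])
        scores = np.array([fitness(ind) for ind in combined])
        pop = combined[np.argsort(scores)[::-1][:pop_size]]
    best = max(pop, key=fitness)
    return best, fitness(best)

# Example: maximize the number of ones in the bit string (OneMax).
if __name__ == "__main__":
    best, score = genetic_algorithm(lambda ind: ind.sum())
    print(best, score)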
“…The SVM parameters are optimized using an evolutionary algorithm, the so-called Genetic Algorithm (GA) introduced by Holland (1975). Some recent papers that deal with GAs are Michalewicz (1996), Gen and Cheng (2000), Melanie (1999), Haupt and Haupt (2004), Sivanandam and Deepa (2008), and Baragona et al. (2011).…”
mentioning confidence: 99%
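As a hedged illustration of the GA-driven SVM hyperparameter tuning mentioned above, the sketch below searches over log10(C) and log10(gamma) of scikit-learn's SVC using cross-validated accuracy as the fitness. The dataset, parameter ranges, tournament selection, and blend crossover are assumptions made for a runnable example, not the cited paper's setup.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def ga_tune_svm(X, y, pop_size=12, n_gen=10, seed=0):
    """Sketch: tune SVC hyperparameters (log10 C, log10 gamma) with a tiny
    real-coded GA; fitness is 3-fold cross-validated accuracy."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform([-2, -5], [3, 0], size=(pop_size, 2))  # log10 ranges

    def fitness(ind):
        model = SVC(C=10 ** ind[0], gamma=10 ** ind[1])
        return cross_val_score(model, X, y, cv=3).mean()

    for _ in range(n_gen):
        scores = np.array([fitness(ind) for ind in pop])
        # tournament selection: each slot keeps the better of two random individuals
        idx = rng.integers(0, pop_size, size=(pop_size, 2))
        parents = pop[np.where(scores[idx[:, 0]] > scores[idx[:, 1]],
                               idx[:, 0], idx[:, 1])]
        # blend crossover plus Gaussian mutation in log10 space
        partners = parents[rng.permutation(pop_size)]
        alpha = rng.random((pop_size, 1))
        children = alpha * parents + (1 - alpha) * partners
        children += rng.normal(scale=0.2, size=children.shape)
        # elitism: keep the best current individual, fill the rest with children
        pop = np.vstack([pop[scores.argmax()], children[:pop_size - 1]])
    scores = np.array([fitness(ind) for ind in pop])
    best = pop[scores.argmax()]
    return 10 ** best[0], 10 ** best[1]

if __name__ == "__main__":
    X, y = load_breast_cancer(return_X_y=True)
    C, gamma = ga_tune_svm(X, y)
    print(f"best C={C:.3g}, gamma={gamma:.3g}")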
“…A promising solution is combining metaheuristic algorithms and statistical models. This is a novel research area that addresses problems characterized by a large design space, high-order interactions between variables, and complex, non-linear experimental surfaces [15]. Some examples of these hybridized methods can be found in [16-21]. One such approach is proposed in [21], where an optimization algorithm called Co-Information Composite Likelihood (COIL), based on the evolutionary paradigm, is introduced by coupling cross-entropy sampling [22] and composite likelihood principles [23] to design novel enzymes with improved functionalities.…”
Section: Introduction
mentioning confidence: 99%
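The COIL algorithm itself is not specified in the quote above; as a rough illustration of the cross-entropy sampling ingredient it mentions, here is a minimal cross-entropy optimization sketch in which the Gaussian sampling distribution, elite fraction, and toy objective are all assumptions, not details of COIL.

import numpy as np

def cross_entropy_optimize(objective, dim=5, n_samples=200, elite_frac=0.1,
                           n_iter=50, seed=0):
    """Minimal cross-entropy method sketch: sample candidates from a Gaussian,
    keep the elite fraction, and refit the Gaussian to the elites."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.zeros(dim), np.ones(dim)
    n_elite = max(1, int(elite_frac * n_samples))
    for _ in range(n_iter):
        candidates = rng.normal(mu, sigma, size=(n_samples, dim))
        scores = np.array([objective(c) for c in candidates])
        elites = candidates[np.argsort(scores)[::-1][:n_elite]]  # maximize
        mu, sigma = elites.mean(axis=0), elites.std(axis=0) + 1e-6
    return mu

# Example: maximize a simple concave objective with optimum at (1, ..., 1).
if __name__ == "__main__":
    print(cross_entropy_optimize(lambda x: -np.sum((x - 1.0) ** 2)))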