2013
DOI: 10.1080/03610918.2012.700364

Comparison of Random Search Method and Genetic Algorithm for Stratification

Abstract: This note shows that the genetic algorithm proposed by Keskintürk and Er (Computational Statistics and Data Analysis 2007, 52, 53-67) usually provides stratification about as effective as that of the random search algorithm proposed by Kozak (Statistics in Transition 2004, 6(5), 797-806), although in some situations it can be noticeably less effective. Despite this, it is suggested that genetic algorithms quite likely have the potential to be a means of very efficient stratification, especially in complex strat…
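
To make the comparison concrete, here is a minimal sketch of how a genetic algorithm can be applied to the univariate boundary problem: candidate solutions are vectors of L-1 stratum boundaries, and fitness is the variance of the stratified mean under Neyman allocation for a fixed total sample size n. The operators shown (truncation selection, uniform crossover, Gaussian mutation), the objective, and all function names are illustrative assumptions made here; this is not the encoding used by Keskintürk and Er (2007), whose algorithm also evolves the per-stratum sample sizes.

```python
import numpy as np

def strat_variance(x, boundaries, n):
    """Variance of the stratified mean under Neyman allocation for a fixed
    total sample size n; returns inf for degenerate stratifications."""
    edges = np.concatenate(([-np.inf], np.sort(boundaries), [np.inf]))
    strata = [x[(x > lo) & (x <= hi)] for lo, hi in zip(edges[:-1], edges[1:])]
    if any(len(s) < 2 for s in strata):
        return np.inf
    N = len(x)
    Nh = np.array([len(s) for s in strata], dtype=float)
    Sh = np.array([s.std(ddof=1) for s in strata])
    nh = np.clip(n * Nh * Sh / (Nh * Sh).sum(), 2, Nh)   # Neyman allocation
    return np.sum((Nh / N) ** 2 * Sh ** 2 / nh * (1 - nh / Nh))

def ga_stratify(x, L=4, n=100, pop=50, gens=200, pmut=0.2, rng=None):
    """Generic GA over boundary vectors: truncation selection, uniform
    crossover, Gaussian mutation.  A sketch, not the published method."""
    rng = np.random.default_rng(rng)
    lo, hi = x.min(), x.max()
    P = np.sort(rng.uniform(lo, hi, size=(pop, L - 1)), axis=1)  # initial population
    for _ in range(gens):
        fit = np.array([strat_variance(x, b, n) for b in P])
        parents = P[np.argsort(fit)[: pop // 2]]         # keep the better half
        children = []
        while len(children) < pop - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(L - 1) < 0.5, a, b)    # uniform crossover
            if rng.random() < pmut:                            # Gaussian mutation
                child = child + rng.normal(scale=0.02 * (hi - lo), size=L - 1)
            children.append(np.sort(np.clip(child, lo, hi)))
        P = np.vstack([parents, children])
    fit = np.array([strat_variance(x, b, n) for b in P])
    return P[np.argmin(fit)], fit.min()
```

A call such as `ga_stratify(np.random.lognormal(size=3000), L=4, n=150)` returns the best boundary vector found and its objective value.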

Cited by 12 publications (9 citation statements)
References: 8 publications

“…Since LH only works on univariate administrative data, strata can be identified as a set of disjoint intervals of the real line. Two approaches to find the boundaries of these intervals are found in Baillargeon and Rivest (2011): a model-based approach used in the original Lavallée and Hidiroglou (1988) paper, and a random search method proposed in Kozak (2004). Due to the excellent performance characteristics without model assumptions, the random search method was chosen for this example.…”
Section: Univariate Example
confidence: 99%
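
For comparison, a Kozak-type random search can be sketched in a few lines: perturb one boundary at a time and keep the move only if it reduces the sample size required to reach a target CV under Neyman allocation. This is a simplification assumed here (continuous boundary moves, a fixed iteration cap), whereas Kozak's (2004) algorithm moves boundaries across the observed sample values and uses explicit stopping rules; it is not the implementation in Baillargeon and Rivest's stratification package.

```python
import numpy as np

def n_required(x, boundaries, cv):
    """Total sample size needed to reach a target CV of the stratified mean,
    under Neyman allocation with SRSWOR within strata."""
    edges = np.concatenate(([-np.inf], np.sort(boundaries), [np.inf]))
    strata = [x[(x > lo) & (x <= hi)] for lo, hi in zip(edges[:-1], edges[1:])]
    if any(len(s) < 2 for s in strata):
        return np.inf
    N = len(x)
    Wh = np.array([len(s) / N for s in strata])
    Sh = np.array([s.std(ddof=1) for s in strata])
    target_var = (cv * x.mean()) ** 2
    return (Wh @ Sh) ** 2 / (target_var + (Wh @ Sh ** 2) / N)

def kozak_search(x, L=5, cv=0.05, iters=30_000, rng=None):
    """Improve-only random search: perturb one randomly chosen boundary per
    iteration and keep the move only if the required n decreases."""
    rng = np.random.default_rng(rng)
    b = np.quantile(x, np.linspace(0, 1, L + 1)[1:-1])   # quantile starting points
    best = n_required(x, b, cv)
    for _ in range(iters):
        cand = b.copy()
        cand[rng.integers(len(cand))] += rng.normal(scale=0.05 * x.std())
        val = n_required(x, cand, cv)
        if val < best:
            b, best = np.sort(cand), val
    return b, best
```

A call such as `kozak_search(np.random.lognormal(size=5000), L=5, cv=0.05)` returns the boundaries found and the implied minimum total sample size.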
“…By efficiently combining the solutions of subproblems, the solution to the overall problem can be found. This technique is widely applied in various fields such as algorithm design, artificial intelligence, economics and bioinformatics, providing an efficient approach for reducing complex problems to optimal solutions, as given in (11). Now let us assume a fraction of the problem defined in (11) for (l₁, m₁) < (L, M)…”
Section: Procedures For Obtaining the Solution Using Dynamic Programming
confidence: 99%
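
Equation (11) and the symbols (l₁, m₁) and (L, M) belong to the citing paper and are not reproduced in this excerpt. As a generic illustration of how dynamic programming applies to univariate stratification, the sketch below splits sorted data into L contiguous strata so as to minimize the sum of W_h·S_h, the quantity that drives the Neyman-allocation sample size; the objective and the O(L·N²) recursion are assumptions made here, not the formulation used in [12].

```python
import numpy as np
from functools import lru_cache

def dp_stratify(x, L=4):
    """Partition sorted data into L contiguous strata minimizing sum_h W_h*S_h.
    O(L * N^2) dynamic program; intended for small N."""
    x = np.sort(np.asarray(x, dtype=float))
    N = len(x)

    def cost(i, j):                       # stratum = x[i:j], needs >= 2 units
        seg = x[i:j]
        return np.inf if len(seg) < 2 else (len(seg) / N) * seg.std(ddof=1)

    @lru_cache(maxsize=None)
    def f(l, j):                          # best value using l strata for x[:j]
        if l == 1:
            return cost(0, j), (0,)
        best, cut = np.inf, None
        for i in range(2 * (l - 1), j - 1):   # leave room for >= 2 units per stratum
            prev, cuts = f(l - 1, i)
            val = prev + cost(i, j)
            if val < best:
                best, cut = val, cuts + (i,)
        return best, cut

    value, cuts = f(L, N)
    boundaries = [x[i - 1] for i in cuts[1:]]   # boundary between x[i-1] and x[i]
    return boundaries, value
```
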
“…Addressing the multivariate stratification problem, [10] introduced an algorithm that utilizes a penalized objective function optimized via the Simulated Annealing technique. Another approach, presented in [11], introduces algorithms for stratifying asymmetric populations using power allocation to estimate sample sizes, while [12] utilized Dynamic Programming and Neyman allocation to tackle the stratification issue, assuming a Weibull distribution for the stratification variable. Several R packages, including GA4Stratification (https://cran.rproject.org/src/contrib/Archive/GA4Stratification/), stratifyR ([13]), and sample (available on the R CRAN), offer tools for stratification.…”
Section: Introduction
confidence: 99%
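
For reference, the acceptance rule that distinguishes Simulated Annealing from an improve-only random search can be sketched as below; the penalized objective of [10] is not reproduced, so `objective` here stands for any callable on a boundary vector (for example, a function returning the sample size required for a target CV).

```python
import numpy as np

def simulated_annealing(objective, b0, iters=20_000, t0=1.0, cooling=0.9995,
                        step=0.05, rng=None):
    """Generic simulated-annealing loop for boundary vectors: worse moves are
    accepted with probability exp(-delta / T), so the search can escape local
    minima that a pure improve-only search would get stuck in."""
    rng = np.random.default_rng(rng)
    b, val, T = np.array(b0, dtype=float), objective(np.sort(b0)), t0
    for _ in range(iters):
        cand = b.copy()
        cand[rng.integers(len(cand))] += rng.normal(scale=step)
        cand_val = objective(np.sort(cand))
        delta = cand_val - val
        if delta < 0 or rng.random() < np.exp(-delta / T):
            b, val = np.sort(cand), cand_val
        T *= cooling                      # geometric cooling schedule
    return b, val
```

Accepting some worsening moves while the temperature T is high lets the search escape local minima; the geometric cooling schedule then makes it increasingly greedy.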
“…The optimum stratification problem, related to the field of probability sampling [8], can be formulated according to two possible goals: (A) minimizing the variance of an estimator given a fixed sample size, or (B) minimizing the sample size for a fixed level of precision. In the literature, most methods were developed aiming at the first goal [2,3,5,6,10,15,18,22,25,26,30,37-39], while the second goal has been less studied [23,24,28,33,35]. This article's optimization problem consists of minimizing the total sample size while simultaneously satisfying the constraints of precision and minimum sample size of each stratum.…”
Section: Introduction
confidence: 99%
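
In standard survey-sampling notation (assumed here, not defined in the excerpt: W_h = N_h/N, S_h² the within-stratum variance, n_h the stratum sample size, b_h the boundaries, c the target CV, n_min the minimum per-stratum size), the two goals can be written as:

```latex
% Goal (A): minimize the variance of the stratified mean for a fixed total sample size n
\min_{b_1 < \dots < b_{L-1},\; n_1,\dots,n_L}\;
  \operatorname{Var}(\bar{y}_{st})
  = \sum_{h=1}^{L} W_h^{2}\,\frac{S_h^{2}}{n_h}\Bigl(1 - \frac{n_h}{N_h}\Bigr)
  \qquad \text{s.t.}\ \sum_{h=1}^{L} n_h = n

% Goal (B): minimize the total sample size for a fixed precision target
% (the formulation studied in the citing article, with a minimum stratum size)
\min_{b_1 < \dots < b_{L-1},\; n_1,\dots,n_L}\; \sum_{h=1}^{L} n_h
  \qquad \text{s.t.}\ \operatorname{CV}(\bar{y}_{st}) \le c,\quad
  n_h \ge n_{\min}\ \ (h = 1,\dots,L)
```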