2014
DOI: 10.3102/0034654313500826
Meta-Analysis With Complex Research Designs

Abstract: Previous research has shown that treating dependent effect sizes as independent inflates the variance of the mean effect size and introduces bias by giving studies with more effect sizes more weight in the meta-analysis. This article summarizes the different approaches to handling dependence that have been advocated by methodologists, some of which are more feasible to implement with education research studies than others. A case study using effect sizes from a recent meta-analysis of reading interventions is …

Cited by 223 publications (93 citation statements)
References 34 publications
“…There were 223 total effect sizes (i.e., total k =223). The number of effect sizes per study that we obtained is relatively common in social science research (Scammacca et al, 2014) and is similar to the number of effect sizes per study seen in similar meta-analyses (Shields et al, 2015). Stress effects on working memory were examined in 34 studies ( k =164) with 1,353 participants.…”
Section: Results (supporting)
confidence: 76%
“…Multiple outcomes are a problem for conventional meta-analytic methods, as averaging effect sizes within studies without accounting for their correlations can alter or obscure true effect size estimates (Borenstein et al, 2009; Scammacca et al, 2014). Thus, we employed the meta-analytic technique of robust variance estimation, a random-effects meta-regression that can account for dependence between effect size estimates (Hedges et al, 2010; Tanner-Smith and Tipton, 2014).…”
Section: Methods (mentioning)
confidence: 99%
“…The present study applied a systematic and rigorous meta-analytic method and examined high quality longitudinal studies of varying duration. In an attempt to more precisely synthesize and quantify the association of peer behavior with smoking initiation and continuation, we also employed the robust variance estimation approach (RVE) with small-sample corrections, a mathematically sound and well-validated method for modeling within-study dependence among effect sizes (Hedges et al, 2010; Samson et al, 2012; Scammacca, Roberts, & Stuebing, 2014; Tanner-Smith & Tipton, 2014; Tipton, 2015). Finally, examining potential moderators of the effect allows us to advance theories of social influence on risk taking during adolescence.…”
Section: Discussion (mentioning)
confidence: 99%
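The robust variance estimation (RVE) method these citing studies describe (Hedges, Tipton & Johnson, 2010) can be sketched for the simplest intercept-only case. The function below is an illustrative implementation, not the cited authors' code: it uses the correlated-effects working model's approximately efficient weights, 1 / (k_j · (v̄_j + τ²)), where k_j is the number of effect sizes in study j and v̄_j their mean sampling variance, and computes a cluster-robust standard error from the squared within-study residual sums.

```python
import numpy as np

def rve_mean(effects, studies, variances, tau2=0.0):
    """Robust variance estimation for a mean effect size with
    dependent (clustered) effect sizes, intercept-only model,
    after Hedges, Tipton & Johnson (2010)."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    studies = np.asarray(studies)

    # Approximately efficient weights under the correlated-effects
    # working model: 1 / (k_j * (v_bar_j + tau2)), shared within study.
    w = np.empty_like(effects)
    for s in np.unique(studies):
        idx = studies == s
        k_j = idx.sum()
        v_bar = variances[idx].mean()
        w[idx] = 1.0 / (k_j * (v_bar + tau2))

    # Weighted mean effect size.
    b = np.sum(w * effects) / np.sum(w)

    # Cluster-robust variance: sum of squared weighted residual
    # totals per study, divided by the squared total weight.
    num = 0.0
    for s in np.unique(studies):
        idx = studies == s
        num += np.sum(w[idx] * (effects[idx] - b)) ** 2
    se = np.sqrt(num / np.sum(w) ** 2)
    return b, se
```

Because each study contributes a single squared "score" to the robust variance regardless of how many effect sizes it reports, dependent effects within a study cannot inflate precision the way treating them as independent would. In practice, meta-analysts use established implementations such as the `robumeta` R package rather than hand-rolled code, and apply the small-sample corrections mentioned above (Tipton, 2015).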