2023
DOI: 10.1002/cjs.11765

PCA Rerandomization

Abstract: Mahalanobis distance of covariate means between treatment and control groups is often adopted as a balance criterion when implementing a rerandomization strategy. However, this criterion may not work well for high‐dimensional cases because it balances all orthogonalized covariates equally. We propose using principal component analysis (PCA) to identify proper subspaces in which Mahalanobis distance should be calculated. Not only can PCA effectively reduce the dimensionality for high‐dimensional covariates, but…
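As a rough illustration of the idea described in the abstract, the sketch below draws complete randomizations and accepts an assignment only when the Mahalanobis distance of covariate mean differences, computed in the subspace spanned by the leading principal components, falls below a threshold. This is a minimal sketch under stated assumptions, not the authors' implementation; the function names, the number of retained components k, and the acceptance threshold a are illustrative.

```python
import numpy as np

def pca_scores(X, k):
    """Project centered covariates onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                          # center each covariate
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                             # n x k matrix of PC scores

def mahalanobis_balance(scores, treat):
    """Mahalanobis distance between treatment and control means of the scores."""
    n1 = treat.sum()
    n0 = len(treat) - n1
    diff = scores[treat == 1].mean(axis=0) - scores[treat == 0].mean(axis=0)
    cov = np.atleast_2d(np.cov(scores, rowvar=False)) * (1 / n1 + 1 / n0)
    return float(diff @ np.linalg.solve(cov, diff))

def pca_rerandomize(X, k=2, a=1.0, max_draws=10_000, seed=None):
    """Redraw equal-sized assignments until the PC-space distance is <= a."""
    rng = np.random.default_rng(seed)
    scores = pca_scores(X, k)
    n = X.shape[0]
    base = np.repeat([1, 0], [n // 2, n - n // 2])   # half treated, half control
    for _ in range(max_draws):
        treat = rng.permutation(base)
        if mahalanobis_balance(scores, treat) <= a:
            return treat
    raise RuntimeError("no acceptable assignment found; loosen a or raise max_draws")
```

In rerandomization the threshold a is commonly chosen as a quantile of the chi-squared distribution with k degrees of freedom, so that a fixed proportion of candidate assignments is accepted.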

Cited by 7 publications (4 citation statements) | References 38 publications
“…Schultzberg & Johansson (2019) developed an allocation scheme that stratifies on binary covariates followed by rerandomization on the continuous covariates. Zhang, Yin & Rubin (2021) presented a rerandomization strategy for high‐dimensional data, which first uses principal component analysis to identify proper strata and subsets of covariates for which rerandomization should be used. Wang, Wang & Liu (2023) extended the finite population asymptotic results to stratified experiments.…”
Section: Conclusion and Discussion (mentioning)
confidence: 99%
“…Again, power analyses will be notationally complex in order to incorporate the criterion for each tier. Other examples include criteria modified by ridge penalties (Branson & Shao, 2021) or principal component analysis (Zhang et al., 2023), which have been shown to increase precision in high-dimensional settings. We suspect that testing power may increase as well.…”
Section: Discussion (mentioning)
confidence: 99%
“…Since , many works have established the benefits of rerandomization; this includes experiments with tiers of covariates (Morgan & Rubin, 2015), sequential experiments (Zhou et al., 2018), factorial experiments (Branson et al., 2016; Li et al., 2020), stratified experiments (Wang et al., 2021), experiments with clusters (Lu et al., 2023) and experiments with high-dimensional covariates (Branson & Shao, 2021; Wang & Li, 2022; Zhang et al., 2023). A common theme is that causal effect estimators are more precise under rerandomization than complete randomization as long as covariates are associated with outcomes.…”
Section: Introduction (mentioning)
confidence: 99%
“…We use the Mahalanobis distance as a balance measure. It is straightforward to extend our methods to RR using other balance measures, such as the Mahalanobis distance within tiers of covariate importance (Morgan and Rubin, 2015), rank‐based balance measure with estimated weights of the covariates (Johansson and Schultzberg, 2020), ridge RR (Branson and Shao, 2021), and PCA RR (Zhang et al., 2021).…”
Section: Discussion (mentioning)
confidence: 99%
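For context on the tiered alternative cited in the last excerpt (Morgan & Rubin, 2015), the sketch below computes a separate Mahalanobis distance for each tier of covariates and accepts an assignment only when every tier meets its own threshold. The tier partition and the thresholds are hypothetical placeholders, not values from any of the cited papers.

```python
import numpy as np

def tier_balance(X, treat, tiers):
    """Mahalanobis distance of covariate mean differences within each tier of columns."""
    n1 = treat.sum()
    n0 = len(treat) - n1
    dists = []
    for cols in tiers:                               # e.g. tiers = [[0, 1], [2, 3, 4]]
        Z = X[:, cols]
        diff = Z[treat == 1].mean(axis=0) - Z[treat == 0].mean(axis=0)
        cov = np.atleast_2d(np.cov(Z, rowvar=False)) * (1 / n1 + 1 / n0)
        dists.append(float(diff @ np.linalg.solve(cov, diff)))
    return dists

def accept(X, treat, tiers, thresholds):
    """Accept an assignment only if every tier satisfies its own balance threshold."""
    return all(d <= a for d, a in zip(tier_balance(X, treat, tiers), thresholds))
```

Tighter thresholds would typically be assigned to the tiers judged most prognostic for the outcome, so that the most important covariates are balanced most strictly.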