2020
DOI: 10.1002/sim.8659
Balancing vs modeling approaches to weighting in practice

Abstract: There are two seemingly unrelated approaches to weighting in observational studies. One of them maximizes the fit of a model for treatment assignment and then derives weights from it; we call this the modeling approach. The other directly optimizes certain features of the weights; we call this the balancing approach. The implementations of these two approaches are related: the balancing approach implicitly models the propensity score, while instances of the modeling approach impose balance conditions on the covariates use…
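As a rough illustration of the "modeling" approach described in the abstract, the sketch below fits a propensity score model and inverts the fitted probabilities to form weights. The simulated data, covariate names, and the logistic-regression model are assumptions for illustration only, not the paper's setup.

```python
# Hedged sketch of the "modeling" approach: fit a treatment-assignment model,
# then invert the fitted probabilities to obtain inverse propensity weights.
# All data here are simulated for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 2))                                # observed covariates
p = 1 / (1 + np.exp(-(0.5 * X[:, 0] - 0.25 * X[:, 1])))    # true propensity
T = rng.binomial(1, p)                                     # treatment indicator

# Modeling approach: maximize the fit of the assignment model, then weight.
ps = LogisticRegression().fit(X, T).predict_proba(X)[:, 1]
w_ipw = np.where(T == 1, 1 / ps, 1 / (1 - ps))             # inverse propensity weights

# Weighted covariate means in each group; with a well-specified model these
# should be approximately balanced.
mean_t = np.average(X[T == 1], axis=0, weights=w_ipw[T == 1])
mean_c = np.average(X[T == 0], axis=0, weights=w_ipw[T == 0])
```

The balancing approach would instead solve for the weights directly, subject to balance constraints on X, without an explicit assignment model.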

Cited by 73 publications (98 citation statements). References 73 publications (195 reference statements).
“…Since (5) is equivalent to modeling ρ(x)/(1 − ρ(x)) with exponential tilting, the weights constructed in this way constitute estimates of w*(a, x). However, this approach is vulnerable to estimation error in the estimated propensity score and can result in extreme weights, as one may encounter with conventional inverse propensity score weights (Kang et al., 2007; Chattopadhyay et al., 2020). In Section 5, we systematically compare this method with the proposed weights (8), and the results suggest that our entropy balancing weighting approach leads to more favorable performance.…”
Section: Existing Weighting Approaches for Causal Generalization (mentioning)
confidence: 99%
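The entropy balancing weights mentioned in the quote can be sketched through their exponential-tilting dual: solve a smooth convex problem for the tilting coefficients, then exponentiate. The data and moment targets below are illustrative assumptions, not the cited paper's estimator.

```python
# Minimal sketch of entropy balancing for a control group via its dual:
# weights proportional to exp(X'lambda), with lambda chosen so the weighted
# control means match the treated-group means. Data are simulated.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
Xc = rng.normal(size=(500, 2))          # control-group covariates (illustrative)
target = np.array([0.3, -0.2])          # treated-group covariate means to match

def dual(lmbda):
    # Dual objective: log-sum-exp of the tilts minus lambda' target.
    tilt = Xc @ lmbda
    return np.log(np.exp(tilt).sum()) - lmbda @ target

lmbda = minimize(dual, np.zeros(2)).x
w = np.exp(Xc @ lmbda)
w /= w.sum()                            # normalized, strictly positive weights

balanced = w @ Xc                       # weighted control means
```

By construction the weights are positive, which avoids the extrapolation issues that negative weights can raise.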
“…(see, e.g., Chattopadhyay et al. 2020). As a special case, when both m₁ and m₀ are linear in Xᵢ, balancing the mean of Xᵢ relative to X̄ suffices to remove the bias of T̂.…”
Section: Finite Sample Properties (mentioning)
confidence: 99%
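The special case in the quote can be checked numerically: with an exactly linear outcome model, any weights that reproduce the target covariate means recover the linear predictor at the target without error. The calibration-style weights below are an illustrative choice, not the estimator studied in the paper.

```python
# Numerical check: if the outcome is exactly linear in X, weights that balance
# the covariate means remove the bias entirely. Data and coefficients are
# made up for illustration.
import numpy as np

rng = np.random.default_rng(2)
Xc = rng.normal(size=(200, 2))
target = np.array([0.5, 0.1])           # covariate means to be matched

# Calibration-style weights: minimum-norm solution of the balance constraints
# sum(w) = 1 and sum(w * X) = target, via the normal equations.
n = len(Xc)
A = np.column_stack([np.ones(n), Xc])
b = np.concatenate([[1.0], target])
w = A @ np.linalg.solve(A.T @ A, b)     # satisfies A.T @ w = b exactly

beta = np.array([2.0, -3.0])
m = lambda X: 1.0 + X @ beta            # linear outcome model
estimate = w @ m(Xc)                    # weighted estimate
truth = 1.0 + target @ beta             # linear predictor at the target
```

Note that such least-squares weights need not be nonnegative, which is exactly when the extrapolation concerns discussed elsewhere in this paper arise.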
“…Negative weights are difficult to interpret and, moreover, they can produce estimates that are an extrapolation outside (rather than an interpolation inside) the support of the available data. In other words, negative weights can produce estimates that are not sample bounded in the sense of Robins et al. (2007) (see also Chattopadhyay et al. 2020). In some settings, there is no alternative to using negative weights in order to adjust for or balance certain features of the distributions of the observed covariates.…”
Section: Extrapolation (mentioning)
confidence: 99%
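A small numerical illustration of the point about negative weights, with made-up data: when the balance target lies outside the support of the covariate, any weights achieving exact mean balance must include negative entries, i.e., the estimate is an extrapolation.

```python
# Illustration: exact mean balance toward a target outside the covariate's
# support forces negative weights. Data are simulated for illustration.
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(0.0, 1.0, size=100)     # covariate supported on [0, 1]
target = 1.5                            # target mean outside that support

# Minimum-norm weights satisfying sum(w) = 1 and sum(w * x) = target.
A = np.column_stack([np.ones(100), x])
b = np.array([1.0, target])
w = A @ np.linalg.solve(A.T @ A, b)

has_negative = (w < 0).any()            # must be True: a convex combination of
weighted_mean = w @ x                   # points in [0, 1] cannot reach 1.5
```

Nonnegative weights summing to one can only reproduce means inside the convex hull of the data, so reaching 1.5 here requires negative weights by construction.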
“…This will reduce the variance of the weights in exchange for the addition of some bias (Owen, 2013a), but there is evidence that this leads to a lower mean-squared error for the target estimand (Wang and Zubizarreta, 2019a; Huling and Mak, 2020; Chattopadhyay et al., 2020).…”
Section: Weighting Estimators (mentioning)
confidence: 99%
“…where k is the number of basis functions, or λ can be selected by first running a stable balancing weights tuning algorithm (Wang and Zubizarreta, 2019a; Chattopadhyay et al., 2020) and then using the selected λ.…”
Section: Hyperparameter Tuning (mentioning)
confidence: 99%
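The tuning discussion above concerns a stable-balancing-weights style problem: minimize the dispersion of the weights subject to approximate mean balance within a tolerance. The sketch below poses that problem directly; the simulated data, the tolerance value, and the SLSQP solver are illustrative assumptions, not the cited tuning algorithm.

```python
# Hedged sketch of a stable-balancing-weights style problem: minimize weight
# dispersion subject to approximate mean balance within a tolerance delta,
# the kind of hyperparameter a tuning algorithm would select. Data simulated.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
X = rng.normal(size=(100, 2))
target = np.array([0.2, -0.1])
delta = 0.1                              # balance tolerance (hyperparameter)
n = len(X)

cons = [
    {"type": "eq",   "fun": lambda w: w.sum() - 1.0},          # weights sum to 1
    {"type": "ineq", "fun": lambda w: delta - np.abs(w @ X - target)},  # |imbalance| <= delta
]
res = minimize(lambda w: n * (w ** 2).sum(),                   # penalize dispersion
               x0=np.full(n, 1.0 / n),
               bounds=[(0.0, None)] * n,                       # nonnegative weights
               constraints=cons, method="SLSQP")
w = res.x
imbalance = np.abs(w @ X - target)
```

Shrinking delta tightens balance at the cost of more dispersed (higher-variance) weights, which is precisely the bias-variance trade-off discussed in the quoted passages.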