2019
DOI: 10.1214/18-sts693

Models as Approximations I: Consequences Illustrated with Linear Regression

Abstract: In the early 1980s, Halbert White inaugurated a "model-robust" form of statistical inference based on the "sandwich estimator" of standard error. This estimator is known to be "heteroskedasticity-consistent," but it is less well known to be "nonlinearity-consistent" as well. Nonlinearity, however, raises fundamental issues because in its presence regressors are not ancillary, hence cannot be treated as fixed. The consequences are deep: (1) population slopes need to be reinterpreted as statistical functionals o…
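To make the abstract's central objects concrete, here is a minimal sketch (not from the paper; the simulated data-generating process and all variable names are illustrative assumptions) that fits a straight line to data with a nonlinear mean function and compares the classical standard error of the slope with White's sandwich standard error.

```python
# Sketch: classical vs. sandwich standard errors for a misspecified linear fit.
# The data-generating process below is an illustrative assumption, not taken
# from the paper: a nonlinear mean function with random (non-ancillary) X.
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0, 3, n)
y = np.sin(2 * x) + rng.normal(scale=0.3, size=n)    # true mean is nonlinear

X = np.column_stack([np.ones(n), x])                  # working model: intercept + slope
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)      # OLS fit of the linear working model
resid = y - X @ beta_hat

XtX_inv = np.linalg.inv(X.T @ X)

# Classical covariance: sigma^2 (X'X)^{-1}, valid only if the linear model is correct.
sigma2 = resid @ resid / (n - X.shape[1])
cov_classical = sigma2 * XtX_inv

# White's sandwich covariance: (X'X)^{-1} X' diag(e_i^2) X (X'X)^{-1}.
meat = X.T @ (X * resid[:, None] ** 2)
cov_sandwich = XtX_inv @ meat @ XtX_inv

print("slope estimate       :", beta_hat[1])
print("classical SE (slope) :", np.sqrt(cov_classical[1, 1]))
print("sandwich  SE (slope) :", np.sqrt(cov_sandwich[1, 1]))
```

Under nonlinearity with random regressors the two standard errors generally disagree; only the sandwich form remains asymptotically valid, which is the "nonlinearity-consistent" property stressed in the abstract.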

Cited by 81 publications (58 citation statements)
References 51 publications
“…Generalized linear models are far and away the primary estimation tool deployed in quantitative sociology, yet many sociologists will admit that the functional form assumptions of these models are far from perfect. The field's awareness of this problem is evident in proposals to assess robustness across model specifications (Young and Holsteen, 2017) as well as perspectives that view any regression model as an approximation (Aronow and Miller, 2019; Berk et al., 2019; Buja et al., 2019). The appeal of new machine learning tools (Molina and Garip, 2019) and predictive exercises (Watts, 2014) derives from how these tools present an opportunity to break out of the parametric models that we all know are imperfect.…”
Section: Estimation: Learn the Empirical Estimand From Data
confidence: 99%
“…In practice, researchers often use a linear instrument-exposure model. From Vansteelandt and Didelez (2018) and Buja, Brown, Berk, George, Pitkin, Traskin, Zhang, and Zhao (2019), we know that the TSLS estimator is consistent even when the relationship between the treatment and the instrument is misspecified.…”
Section: Model
confidence: 97%
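A minimal sketch of the quoted point, under assumed simulated data (the nonlinear instrument-exposure relationship, the effect size of 2, and all names are illustrative, not from either cited paper): the first stage is fit with a deliberately misspecified linear working model, yet the two-stage least squares slope still recovers the exposure effect.

```python
# Sketch: two-stage least squares with a misspecified (linear) first stage.
# All names and the simulated data-generating process are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
z = rng.normal(size=n)                          # instrument
u = rng.normal(size=n)                          # unobserved confounder
d = np.exp(0.5 * z) + u + rng.normal(size=n)    # exposure: nonlinear in z
y = 2.0 * d + 3.0 * u + rng.normal(size=n)      # outcome: true effect of d is 2

Z = np.column_stack([np.ones(n), z])
D = np.column_stack([np.ones(n), d])

# Stage 1: project the exposure on the instrument with a *linear* working model.
gamma_hat, *_ = np.linalg.lstsq(Z, d, rcond=None)
d_hat = Z @ gamma_hat

# Stage 2: regress the outcome on the fitted exposure.
D_hat = np.column_stack([np.ones(n), d_hat])
beta_hat, *_ = np.linalg.lstsq(D_hat, y, rcond=None)

print("naive OLS slope :", np.linalg.lstsq(D, y, rcond=None)[0][1])  # biased by the confounder
print("TSLS slope      :", beta_hat[1])                              # close to 2 despite stage-1 misspecification
```

The naive regression of the outcome on the exposure is biased by the unobserved confounder, while the TSLS slope stays near the target even though the fitted first stage is only a linear approximation of the true instrument-exposure relationship.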
“…Even if it does, then overly optimistic inferences are typically obtained (notably even when robust standard errors are used). This is the result of excess variability that most estimators exhibit when models are misspecified (Buja et al., 2019) or when variable selection procedures are employed to construct a well-fitting model (Leeb and Pötscher, 2006).…”
Section: The Tradition Within Statistics
confidence: 99%
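As a hedged illustration of the "excess variability" point (the simulation design is assumed, not taken from the cited papers), the sketch below repeats a misspecified linear fit over many draws with random regressors and compares the actual sampling variability of the slope with the average classical and sandwich standard errors.

```python
# Sketch (illustrative assumptions throughout): Monte Carlo comparison of the
# true sampling variability of an OLS slope under model misspecification with
# the average classical and sandwich standard errors.
import numpy as np

rng = np.random.default_rng(2)
n, n_rep = 200, 2000
slopes, se_classical, se_sandwich = [], [], []

for _ in range(n_rep):
    x = rng.uniform(0, 3, n)
    y = np.sin(2 * x) + rng.normal(scale=0.3, size=n)   # nonlinear truth, linear working model
    X = np.column_stack([np.ones(n), x])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b
    XtX_inv = np.linalg.inv(X.T @ X)
    cov_c = (e @ e / (n - 2)) * XtX_inv                  # classical, model-trusting
    cov_s = XtX_inv @ (X.T @ (X * e[:, None] ** 2)) @ XtX_inv   # sandwich
    slopes.append(b[1])
    se_classical.append(np.sqrt(cov_c[1, 1]))
    se_sandwich.append(np.sqrt(cov_s[1, 1]))

print("empirical SD of slope :", np.std(slopes))
print("mean classical SE     :", np.mean(se_classical))
print("mean sandwich SE      :", np.mean(se_sandwich))
```

Because the classical standard error trusts the working model, it no longer tracks the empirical spread of the slope once the model is misspecified and the regressors are random, whereas the sandwich standard error remains consistent for that spread.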