2021
DOI: 10.1016/j.jeconom.2020.07.048
Simple and trustworthy cluster-robust GMM inference

Abstract: This paper develops a new asymptotic theory for two-step GMM estimation and inference in the presence of clustered dependence. The key feature of the alternative asymptotics is that the number of clusters G is regarded as small or fixed as the sample size increases. Under the small-G asymptotics, this paper shows that the centered two-step GMM estimator and the two continuously-updating GMM estimators we consider have the same asymptotic mixed normal distribution. In addition, the J statistic, the trinity of two-step GMM s…
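The centered estimator mentioned in the abstract can be illustrated with a short sketch. The function below computes a centered cluster-robust covariance of moment conditions, the kind of object used as a GMM weighting matrix: per-cluster moment sums are recentered at the full-sample mean before their outer products are averaged. This is a minimal illustration under assumed conventions (the scaling and variable names are not taken from the paper).

```python
import numpy as np

def centered_cluster_covariance(moments, cluster_ids):
    """Centered cluster-robust covariance of moment conditions.

    moments:     (n, m) array of per-observation moment vectors.
    cluster_ids: length-n array assigning each observation to a cluster.

    Each cluster's moment sum is centered at that cluster's share of the
    full-sample mean before the outer products are accumulated; centering
    is what distinguishes the "centered" two-step weighting matrix from
    the uncentered one.
    """
    moments = np.asarray(moments, dtype=float)
    cluster_ids = np.asarray(cluster_ids)
    n, m = moments.shape
    g_bar = moments.mean(axis=0)          # full-sample moment mean
    omega = np.zeros((m, m))
    for g in np.unique(cluster_ids):
        in_g = cluster_ids == g
        # centered cluster sum: sum over the cluster minus its share of the mean
        s_g = moments[in_g].sum(axis=0) - in_g.sum() * g_bar
        omega += np.outer(s_g, s_g)
    return omega / n

# Illustrative use: 200 observations, 3-dimensional moments, 5 clusters.
rng = np.random.default_rng(0)
gmat = rng.standard_normal((200, 3))
ids = rng.integers(0, 5, size=200)
omega_hat = centered_cluster_covariance(gmat, ids)
```

In the fixed-G framework the paper studies, G stays fixed while cluster sizes grow, so an estimator like this converges to a random limit rather than a constant; that is what drives the mixed normal (rather than normal) asymptotic distribution of the two-step estimator.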

Cited by 12 publications (22 citation statements)
References 61 publications
“…We extend the asymptotic convergence in distribution of single-stage and two-stage estimators from Newey and McFadden (1994) and Hwang (2021) to weakly approaching sequences of distributions. As explained in Section 3, p. 2142, of Newey and McFadden (1994) […] As described above, asymptotic normality results from convergence in probability of the Hessian, convergence in distribution of the average score, and the Slutsky theorem.…”
Section: Breakdown of the Following Steps (mentioning)
confidence: 98%
“…We introduce a regularity assumption on the latent trawl set to allow the application of the central limit theorem from Hwang (2021). It unfolds as follows in Assumption 2.…”
Section: Assumptions for the Single-Stage Asymptotics (mentioning)
confidence: 99%
“…Hansen and Lee (2019, Theorem 12) provides a similar result for GMM estimation, which is also very widely applicable. More recently, the fixed-G approach discussed in Section 4.2 has been applied to GMM estimation by Hwang (2021). It leads to a novel inferential procedure that involves modifying the usual asymptotic t and F statistics, but it requires that cluster sizes be approximately equal.…”
Section: Cluster-Robust Inference in Nonlinear Models (mentioning)
confidence: 99%
“…However, as emphasized by Ibragimov and Müller (2010, 2016), Bester, Conley, and Hansen (2011), Cameron and Miller (2015), Canay, Romano, and Shaikh (2017), Hagemann (2019a,b, 2020) and Canay et al. (2020), many empirical studies motivate an alternative framework in which the number of clusters is small, while the number of observations in each cluster is relatively large. For inference, we may consider applying the approaches developed by Bester et al. (2011), Hwang (2020), Ibragimov and Müller (2010, 2016), and Canay et al. (2017). However, Bester et al. (2011) and Hwang (2020) require (asymptotically) equal cluster-level sample sizes, while Ibragimov and Müller (2010, 2016) and Canay et al. (2017) hinge on strong identification for all clusters.…”
Section: Introduction (mentioning)
confidence: 99%
“…For inference, we may consider applying the approaches developed by Bester et al. (2011), Hwang (2020), Ibragimov and Müller (2010, 2016), and Canay et al. (2017). However, Bester et al. (2011) and Hwang (2020) require (asymptotically) equal cluster-level sample sizes, while Ibragimov and Müller (2010, 2016) and Canay et al. (2017) hinge on strong identification for all clusters. Our bootstrap Wald test is more flexible as it does not require equal cluster size and only needs strong identification in one of the clusters.…”
Section: Introduction (mentioning)
confidence: 99%