2012
DOI: 10.1016/j.jeconom.2012.05.010

Efficient minimum distance estimation with multiple rates of convergence

Abstract: This paper extends the asymptotic theory of GMM inference to allow sample counterparts of the estimating equations to converge at (multiple) rates, different from the usual square root of the sample size. In this setting, we provide consistent estimation of the structural parameters. In addition, we define a convenient rotation in the parameter space (or reparametrization) which makes it possible to disentangle the different rates of convergence. More precisely, we identify special linear combinations of the st…
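As a rough sketch of the setting described in the abstract (our notation, not the paper's exact framework): suppose the sample moment conditions split into a block that converges at the usual root-n rate and a block that converges at a slower rate $r_n$,

\[
\sqrt{n}\,\bar g_{1,n}(\theta_0) = O_p(1), \qquad r_n\,\bar g_{2,n}(\theta_0) = O_p(1), \qquad \frac{r_n}{\sqrt{n}} \to 0,
\]

so that a suitable rotation of the parameter space, say $\eta = R'\theta$ with $R = [R_1 \; R_2]$, separates linear combinations $R_1'\theta$ that are estimable at rate $\sqrt{n}$ from combinations $R_2'\theta$ that are estimable only at the slower rate $r_n$.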

Cited by 47 publications (40 citation statements)
References 35 publications
“…Fortunately, Gagliardini et al (2011) and Antoine and Renault (2012) have developed XMM, an extension of GMM, to explicitly incorporate moments with different rates of convergence.…”
Section: Estimation: XMM
confidence: 99%
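As a minimal illustration of the issue this citing work addresses (a generic stacked-GMM sketch in our notation, not the actual XMM criterion of Gagliardini et al.): stacking moment blocks with different rates into one quadratic form,

\[
\hat\theta = \arg\min_{\theta}\; \bar g_n(\theta)'\, W_n\, \bar g_n(\theta), \qquad \bar g_n(\theta) = \begin{pmatrix} \bar g_{1,n}(\theta) \\ \bar g_{2,n}(\theta) \end{pmatrix},
\]

requires a weighting matrix $W_n$ that accounts for the two blocks $\bar g_{1,n}(\theta_0)$ and $\bar g_{2,n}(\theta_0)$ being of different stochastic orders, $O_p(n^{-1/2})$ and $O_p(r_n^{-1})$ respectively.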
“…Following Gagliardini et al (2011) and Antoine and Renault (2012), we assume local identification in the sense of Rothenberg (1971). That is, θ₀ is locally identified if there exists an open neighborhood of θ₀ containing no other θ that can generate the same distribution.…”
Section: Moment Conditions
confidence: 99%
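For reference (a standard textbook formulation of Rothenberg's (1971) condition, not a quotation from either paper): with moment restrictions $E[g(Y_t,\theta_0)] = 0$, a sufficient condition for local identification of $\theta_0$ is that the Jacobian of the population moments has full column rank at $\theta_0$,

\[
\operatorname{rank}\!\left(\left.\frac{\partial\, E[g(Y_t,\theta)]}{\partial \theta'}\right|_{\theta=\theta_0}\right) = \dim(\theta),
\]

which rules out any other parameter value in a neighborhood of $\theta_0$ satisfying the same restrictions.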
“…We suppose that J may have infinite cardinality, or it may have a cardinality that is increasing with sample size n. Han and Phillips (2006) consider a similar setup except that they combine the moment conditions in the classical GMM way, i.e., (2.2), using identity weighting; they also allow for the possibility that some of their moment conditions provide only weak identification. Lee (2010) considers the case where τ is fixed but each estimator may have a different rate of convergence (see also Antoine and Renault, 2012). We work with a situation where each estimator is √n-consistent, however, in our case the asymptotic variance V_jj of the j-th estimator may increase to infinity with j, which essentially reflects the same phenomenon of different precision across the estimators.…”
confidence: 99%
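For context (equation (2.2) of the citing paper is not reproduced here; this is just the generic form that "the classical GMM way using identity weighting" refers to):

\[
\hat\theta = \arg\min_{\theta}\; \bar g_n(\theta)'\,\bar g_n(\theta), \qquad \bar g_n(\theta) = \frac{1}{n}\sum_{t=1}^{n} g(Y_t,\theta),
\]

so every moment enters the quadratic criterion with equal weight, irrespective of its precision or rate of convergence.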
“…The rotation technique we use in our asymptotic derivations has many antecedents in the literature. For example, Sargan (), Phillips () and Choi and Phillips () used similar rotations to derive limit theory for estimators under identification failure; Antoine and Renault (, ) used similar rotations to derive limit theory for estimators under “nearly‐weak” identification; Andrews and Cheng () (AC14 hereafter) used similar rotations to find the asymptotic distributions of Wald statistics under weak and nearly‐strong identification; and recently Phillips () used similar rotations to find limit theory for regression estimators in the presence of near‐multicollinearity in regressors. However, unlike their predecessors used for specific linear models, our nonlinear reparameterizations are not generally equivalent to the rotations we use to derive asymptotic theory.…”
Section: Introduction
confidence: 99%