1989
DOI: 10.2307/2290059
Choosing Among Alternative Nonexperimental Methods for Estimating the Impact of Social Programs: The Case of Manpower Training

Cited by 441 publications (301 citation statements). References 9 publications.
“…These and other studies reported in the literature more than 20 years ago fueled the flight of many empirical economists from structural models, even though Heckman and Hotz (1989) cautioned that many applications of the structural approach by those comparing structural estimates with experimental estimates did not perform specification tests to see if the estimated structural models were concordant with the pre-program data. They show that when such tests are performed, the surviving structural models closely match the estimates produced from the experiment analyzed by LaLonde, findings duplicated for other experiments (see Todd and Wolpin, 2006, Attanasio, Meghir, and Santiago, 2009, and the discussion in Keane, Todd, and Wolpin, 2010).…”
Section: Introduction (mentioning)
confidence: 99%
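The specification tests the quoted passage refers to check whether a candidate nonexperimental estimator is consistent with pre-program data: if the estimator's identifying assumption holds, treatment status should not predict outcomes measured before the program began. A minimal sketch of such a placebo test, with entirely simulated data and hypothetical variable names (not from the paper):

```python
# Hypothetical sketch of a Heckman-Hotz style pre-program specification test:
# regress a *pre-program* outcome on the treatment indicator plus controls.
# Under valid selection-on-observables, the treatment coefficient should be
# indistinguishable from zero. Data-generating process is illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=n)                          # observed covariate
d = (x + rng.normal(size=n) > 0).astype(float)  # selection on the observable x
y_pre = 1.0 + 0.5 * x + rng.normal(size=n)      # pre-program outcome, no effect

# OLS of the pre-program outcome on treatment and the covariate
X = np.column_stack([np.ones(n), d, x])
beta, *_ = np.linalg.lstsq(X, y_pre, rcond=None)
resid = y_pre - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
t_stat = beta[1] / se
print(f"pre-program 'effect' = {beta[1]:.3f}, t = {t_stat:.2f}")
# A large |t| would reject the identifying assumption; here selection is on
# the included covariate x, so conditioning on x should pass the test.
```

A significant pre-program "effect" signals that the estimator's comparison group is not a valid counterfactual, which is how misspecified models are screened out before matching against experimental benchmarks.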
“…27 It is for this reason that the similarity of pre-intervention trends in outcomes is often considered a necessary but insufficient condition for the validity of DiD methods. 28 The notable contributions in this area are all from the US: see Heckman and Hotz (1989), Friedlander and Robins (1995), Heckman and Smith (1995), Lalonde (1986), and Dehejia and Wahba (1999). 29 For more information, see http://www.ifs.org.uk/centres/ PEPA.…”
Section: Results (mentioning)
confidence: 99%
“…Goldberger (1972) was the first to consider a similar situation and argued that researchers should be cautious in interpreting the impact of a social program due to a potential bias. Heckman and Hotz (1989) show that when assignment to probation is nonrandom, "selection bias" in the estimation of a in Equation 1 can arise due to dependence between d_i and u_i. In other words, selection bias is present if E(u_i | d_i = 1) ≠ E(u_i | d_i = 0), and consequently the estimate of a in Equation 1 is biased. A stochastic relationship between d_i and u_i, of course, can arise for a variety of reasons.…”
Section: Correcting Selection and Censoring Biases (mentioning)
confidence: 95%
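The dependence between d_i and u_i described above can be made concrete with a short simulation (illustrative only, not from either cited paper): when assignment depends on the unobserved error, the naive treatment-control mean difference no longer recovers the true effect.

```python
# Illustrative simulation of selection bias: assignment d_i depends on the
# unobserved error u_i, so the naive mean difference is biased for the true
# effect a. All names and parameter values here are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
a_true = 2.0                                     # true program effect
u = rng.normal(size=n)                           # unobserved error u_i
d = (u + rng.normal(size=n) > 0).astype(float)   # selection on unobservables
y = a_true * d + u                               # outcome equation

naive = y[d == 1].mean() - y[d == 0].mean()
print(f"true effect = {a_true}, naive estimate = {naive:.3f}")
# The gap (naive - a_true) equals E(u | d = 1) - E(u | d = 0),
# which is exactly the selection-bias term discussed in the text.
```

Because u enters both the assignment rule and the outcome, E(u | d = 1) ≠ E(u | d = 0) and the naive comparison overstates the effect; conditioning on observables cannot remove bias of this kind.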