2021
DOI: 10.1016/j.jeconom.2020.02.007

Solving dynamic discrete choice models using smoothing and sieve methods

Abstract: We propose to combine smoothing, simulations and sieve approximations to solve for either the integrated or expected value function in a general class of dynamic discrete choice (DDC) models. We use importance sampling to approximate the Bellman operators defining the two functions. The random Bellman operators, and therefore also the corresponding solutions, are generally non-smooth, which is undesirable. To circumvent this issue, we introduce a smoothed version of the random Bellman operator and solve for the…
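To make the abstract's three ingredients concrete, here is a minimal, self-contained sketch (my own illustration, not the authors' code) applied to a toy two-choice model: a simulated Bellman operator (plain Monte Carlo draws here, whereas the paper uses importance sampling), a log-sum-exp smoothing of the max over choices with a smoothing parameter lam, and a polynomial sieve refitted by least squares at each successive-approximation step. The utility function, state transition, discount factor, and all tuning parameters below are assumed purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
beta, lam = 0.95, 0.1            # discount factor and smoothing parameter (assumed)
eps_draws = rng.normal(size=200)  # Monte Carlo draws for the next-state integral
                                  # (the paper uses importance sampling instead)

def utility(s, a):
    # hypothetical per-period utility (assumed for illustration)
    return -0.5 * s + 2.0 * a - 1.0 * a * s

def next_state(s, a, eps):
    # hypothetical AR(1)-style state transition (assumed)
    return 0.8 * s + 0.5 * a + eps

def basis(s, K=5):
    # polynomial sieve basis phi_0(s), ..., phi_{K-1}(s)
    return np.vander(np.atleast_1d(s), K, increasing=True)

grid = np.linspace(-3.0, 3.0, 50)  # evaluation points for the projection step
theta = np.zeros(5)                # sieve coefficients: V(s) ~ basis(s) @ theta

for it in range(1000):
    choice_values = []
    for a in (0, 1):
        # simulate next states and approximate E[V(s') | s, a] by Monte Carlo;
        # clip to the grid so polynomial extrapolation does not destabilize the loop
        s_prime = np.clip(next_state(grid[:, None], a, eps_draws[None, :]),
                          grid[0], grid[-1])
        EV = (basis(s_prime.ravel()) @ theta).reshape(s_prime.shape).mean(axis=1)
        choice_values.append(utility(grid, a) + beta * EV)
    # smoothed max over choices: lam * log-sum-exp of the choice-specific values
    V_target = lam * np.logaddexp(choice_values[0] / lam, choice_values[1] / lam)
    # projection step: refit the sieve coefficients by least squares on the grid
    theta_new = np.linalg.lstsq(basis(grid), V_target, rcond=None)[0]
    if np.max(np.abs(theta_new - theta)) < 1e-8:
        theta = theta_new
        break
    theta = theta_new

print("sieve coefficients:", theta)
```

Swapping the plain Monte Carlo draws for importance-sampling draws, or the polynomial basis for another sieve, changes only the marked lines; the smoothing and projection steps stay the same.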

Cited by 6 publications (2 citation statements); References 27 publications.

Citation statements (ordered by relevance):
“…Rust (1997a, pp. 35, 47) explained that this “problem can lead V_N(s, a) to be a poor estimate of the true expected value function” and that it “can be a much more serious problem in higher dimensional problems.” Kristensen, Mogensen, Myun Moon, and Schjerning (2021, p. 329) empirically confirmed this claim, reporting that the “variances of [Rust's] self-approximating method increase dramatically as the number of state variables increases.”…”
Section: Introduction (mentioning)
Confidence: 88%
“…Additional shape constraints, sparsity patterns, and better grid choices are useful in reducing computational burden. See, e.g., Chen (2007), who discusses various sieve-based methods, and Kristensen et al. (2021), who discuss various approximation architectures for approximating value functions in dynamic models. Second, there are also many model-specific techniques for approximating functions using a small number of basis functions.…”
Section: The Choice of Tuning Parameters (mentioning)
Confidence: 99%