2021
DOI: 10.1016/j.jocm.2021.100323
Evaluating the predictive abilities of mixed logit models with unobserved inter- and intra-individual heterogeneity

Cited by 21 publications (3 citation statements); references 33 publications.
“…One of the maintained conditions in this research is that individual parameters are considered constant across a set of replications. This restriction can of course be removed, as suggested by Swait et al (2016) for latent class (mixture) models and, more recently, by Krueger et al (2021) for the Mixed Logit model. Both allow for intra-person variation in preference parameters, over and above inter-personal preference heterogeneity.…”
Section: Discussion
confidence: 99%
“…where C_n is the choice set; X_it is the observed variable vector for mode i in scenario t; β_ni is the corresponding coefficient vector for traveler n; and ε_nit is the unobserved part of utility and is assumed to be independently and identically distributed according to Gumbel(0, 1) across travelers, choice scenarios, and modes. The Gumbel distribution of ε_nit is different from the normal distribution of the error term in traditional regression models (e.g., generalized linear models and the CD production function) (Elahi et al, 2021; Krueger et al, 2021; Elahi et al, 2022; McFadden, 2022; Yang Y. et al, 2022).…”
Section: Model Specification
confidence: 91%
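The "where" clause above refers to an equation that the snippet omits. Under the standard logit specification those definitions describe, the utility and the choice probability conditional on the individual-level coefficients would read as follows (a reconstruction from the stated definitions, not quoted from the citing paper):

```latex
U_{nit} = \beta_{ni}^{\top} X_{it} + \varepsilon_{nit},
\qquad
P_{nit} = \frac{\exp\left(\beta_{ni}^{\top} X_{it}\right)}
               {\sum_{j \in C_n} \exp\left(\beta_{nj}^{\top} X_{jt}\right)}
```

With ε_nit i.i.d. Gumbel(0, 1), the closed-form logit probability above holds conditional on β_n; the mixed logit probability then integrates this expression over the assumed distribution of β_n.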
“…For instance, if the EM takes 1000 iterations to converge, the estimation of DLCM involves the estimation of 1000 mixed logit models. For this reason, we rely on the Broyden–Fletcher–Goldfarb–Shanno (BFGS) algorithm with analytical gradients to optimize these functions in Python (see appendix of Krueger et al, 2021, for analytical gradients).…”
Section: Model Estimation
confidence: 99%
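The snippet describes maximizing a likelihood with BFGS and analytical gradients in Python. A minimal sketch of that pattern (not the cited authors' code; the simulated data and the simple multinomial logit likelihood are illustrative assumptions) using `scipy.optimize.minimize` with the `jac` argument supplying the gradient:

```python
# Sketch: multinomial logit estimation via BFGS with an analytical gradient.
# The data-generating process below is simulated purely for illustration.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
N, J, K = 500, 3, 2                        # observations, alternatives, attributes
X = rng.normal(size=(N, J, K))             # attribute matrix per alternative
beta_true = np.array([1.0, -0.5])
util = X @ beta_true
prob = np.exp(util) / np.exp(util).sum(axis=1, keepdims=True)
y = np.array([rng.choice(J, p=p) for p in prob])   # simulated chosen alternatives

def neg_loglik(beta):
    v = X @ beta                                   # (N, J) systematic utilities
    v -= v.max(axis=1, keepdims=True)              # shift for numerical stability
    p = np.exp(v) / np.exp(v).sum(axis=1, keepdims=True)
    return -np.log(p[np.arange(N), y]).sum()

def neg_grad(beta):
    v = X @ beta
    v -= v.max(axis=1, keepdims=True)
    p = np.exp(v) / np.exp(v).sum(axis=1, keepdims=True)
    # Analytical score: sum over n of (x_chosen - expected x under the model).
    return -(X[np.arange(N), y] - (p[..., None] * X).sum(axis=1)).sum(axis=0)

res = minimize(neg_loglik, np.zeros(K), jac=neg_grad, method="BFGS")
print(res.x)  # estimates should lie close to beta_true
```

Supplying `jac` avoids the finite-difference gradient approximations that make quasi-Newton methods expensive and noisy when, as the snippet notes, the likelihood must be optimized hundreds of times inside an EM loop.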