2016
DOI: 10.1080/10705511.2016.1250637

A Comparison of Methods for Uncovering Sample Heterogeneity: Structural Equation Model Trees and Finite Mixture Models

Abstract: Although finite mixture models have received considerable attention, particularly in the social and behavioral sciences, an alternative method for creating homogeneous groups, structural equation model trees (Brandmaier, von Oertzen, McArdle, & Lindenberger, 2013), is a recent development that has received much less application and consideration. It is our aim to compare and contrast these methods for uncovering sample heterogeneity. We illustrate the use of these methods with longitudinal reading achievement …

Cited by 24 publications (28 citation statements); references 38 publications.
“…The model fit poorly using the entire sample (all participants endorsing past 12-month NSSI), with a root mean square error of approximation (RMSEA) of 0.141 (90% confidence interval [CI] = 0.108, 0.176). Note that some degree of misfit is necessary for the identification of subgroups (see Jacobucci et al. (2017) for further detail). See Figure for the resulting SEM tree diagram.…”
Section: Results (mentioning; confidence: 99%)
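The excerpt above summarizes overall misfit via the RMSEA. As a point of reference, below is a minimal sketch of how an RMSEA point estimate is commonly computed from a model's chi-square statistic, its degrees of freedom, and the sample size; the numeric inputs are hypothetical and are not taken from the cited study, and the 90% CI reported in the excerpt would additionally require inverting a noncentral chi-square distribution.

```python
import math

def rmsea(chi_square: float, df: int, n: int) -> float:
    """RMSEA point estimate: sqrt(max(chi2 - df, 0) / (df * (n - 1))).

    This is one common parameterization; some software divides by n
    rather than n - 1.
    """
    return math.sqrt(max(chi_square - df, 0.0) / (df * (n - 1)))

# Hypothetical inputs for illustration only (not values from the cited study).
print(round(rmsea(chi_square=180.0, df=24, n=300), 3))
```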
“…Structural equation model trees (SEM trees; Brandmaier, von Oertzen, McArdle, & Lindenberger, 2013; Jacobucci, Grimm, & McArdle, 2017) are a generalization of decision trees that build tree structures by recursively separating a data set into subsets that are maximally different with respect to the fit of an SEM. By splitting observations into groups based on their predictor values, SEM trees can be thought of as a form of exploratory multiple-group modeling that results in a tree-like structure indicating final group membership.…”
Section: Methods (mentioning; confidence: 99%)
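The passage above describes the core SEM tree step: search candidate splits on observed covariates and keep the split under which subgroup-specific models fit substantially better than a single model for the combined sample. The sketch below illustrates only that split-search logic, using a saturated multivariate-normal log-likelihood as a stand-in for the fit of a hypothesized SEM; all function and variable names here are illustrative, and a real analysis would refit the actual SEM in each candidate subgroup (for example, via the semtree package for R, which works with OpenMx and lavaan models).

```python
import numpy as np
from scipy.stats import multivariate_normal

def loglik(y: np.ndarray) -> float:
    """Gaussian log-likelihood of the observed variables; a stand-in for
    the log-likelihood of a fitted SEM in a real SEM tree."""
    mu = y.mean(axis=0)
    cov = np.cov(y, rowvar=False, bias=True) + 1e-6 * np.eye(y.shape[1])  # small ridge for stability
    return multivariate_normal(mean=mu, cov=cov).logpdf(y).sum()

def best_split(y: np.ndarray, covariates: np.ndarray, min_n: int = 30):
    """Evaluate every binary split on every covariate and return the split
    with the largest likelihood-ratio gain over the unsplit sample."""
    base = loglik(y)
    best = None
    for j in range(covariates.shape[1]):
        for cut in np.unique(covariates[:, j])[:-1]:  # drop the maximum to avoid an empty right node
            left = covariates[:, j] <= cut
            if left.sum() < min_n or (~left).sum() < min_n:
                continue  # skip splits that leave a child node too small
            gain = loglik(y[left]) + loglik(y[~left]) - base
            if best is None or gain > best[0]:
                best = (gain, j, cut)
    return best  # (gain, covariate index, cut point), or None if no admissible split
```

A full SEM tree would additionally apply a stopping criterion (such as a likelihood-ratio test with a multiple-testing correction, or cross-validation) and recurse into the child nodes produced by each accepted split.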
“…For example, larger values of the number of true nodes C, which is also a newly specified factor in the current simulation, have a negative impact on correct estimation of the number of true classes, because when C = 4, statistical power for correct estimation of classes dramatically decreases compared with the case of C = 2. This point is critically important, since in many applications of tree analyses there is more than one split, extracting more than two classes (e.g., Brandmaier et al. 2013; Jacobucci et al. 2017). Therefore, the previous investigations might show overly optimistic results regarding the performance of SEM Trees in general.…”
Section: Model Selection Procedures (mentioning; confidence: 82%)
“…During the past decade, researchers have shown growing interest in applying latent growth curve mixture models (LGCMMs; Berlin et al. 2014; Leiby et al. 2009; Neelon et al. 2011; Ram and Grimm 2009) and, more recently, in applying machine learning techniques including SEM Trees (e.g., Hayes et al. 2015; Martin 2015; Jacobucci et al. 2017). Both LGCMMs and SEM Trees utilize SEM to model changes using latent variables estimated with a smaller number of parameters.…”
Section not specified (mentioning; confidence: 99%)