2013
DOI: 10.1080/03610918.2012.707455
Trajectory Modeling of Longitudinal Binary Data: Application of the EM Algorithm for Mixture Models

Cited by 4 publications (6 citation statements)
References 19 publications
“…For non-continuous data, either longitudinal or multi-dimensional, the classification is often performed by mixture modeling. [15][16][17][18] In the case of trajectory classification, each individual trajectory is modeled by a mixture of a finite number of polynomials or spline functions, the mixing proportions varying from one individual to another. Some methods assume that there is no intra-group heterogeneity (e.g.…”
Section: Introduction
confidence: 99%
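The mixture-model classification described in the excerpt above can be sketched with a minimal EM loop for longitudinal binary data. As a deliberate simplification, each group here has a single constant success probability rather than a polynomial trajectory in time; the function name and all starting values are illustrative, not taken from the cited works.

```python
import numpy as np

def em_bernoulli_mixture(Y, K=2, n_iter=100):
    """EM for a K-group mixture of Bernoulli trajectories.

    Y : (n, T) binary array, one row per subject.
    Simplification: group k has one constant success probability p[k];
    a full group-based trajectory model would make p a polynomial in time.
    """
    n, T = Y.shape
    s = Y.sum(axis=1)                 # per-subject success counts
    pi = np.full(K, 1.0 / K)          # mixing proportions
    p = np.linspace(0.25, 0.75, K)    # spread-out starting probabilities
    for _ in range(n_iter):
        # E-step: posterior probability that subject i belongs to group k
        logw = (np.log(pi) + s[:, None] * np.log(p)
                + (T - s)[:, None] * np.log1p(-p))
        logw -= logw.max(axis=1, keepdims=True)
        W = np.exp(logw)
        W /= W.sum(axis=1, keepdims=True)
        # M-step: closed-form updates of mixing proportions and probabilities
        pi = W.mean(axis=0)
        p = (W * s[:, None]).sum(axis=0) / (W.sum(axis=0) * T)
    return pi, p, W
```

Replacing the constant `p[k]` with a logistic polynomial in time recovers the trajectory-model setting, at the cost of an iterative M-step.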
“…For example: (i) the detection of prostate cancer recurrence relies on prostate specific antigen (PSA) levels which are longitudinally monitored after the first radiotherapy; 1 (ii) the detection of osteoporosis by regular measures of bone mineral density; (iii) the analysis of the long-term immune recovery of HIV-infected and treated patients through recurrent CD4 cell counts; 2 (iv) the identification of profiles of juvenile delinquency by the use of teacher reports of physical aggression by pupils aged 6–15.…”
Section: Introduction
confidence: 99%
“…In our analysis, we applied the EM algorithm for double penalized likelihood estimation as used by Lee et al 36 and Chu and Koval. 54 In semiparametric multilevel regression models, the parameter estimation procedure is simplified and sped up using hybrid algorithms, such as combining the EM algorithm with the Newton-Raphson method. Alternatively, there are numerical integration approximations, such as Gauss-Hermite quadrature, for evaluating the integral over the random effects distribution in single-level zero-inflated models.…”
Section: Discussion
confidence: 99%
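The Gauss-Hermite quadrature mentioned in the excerpt above approximates an integral over a normally distributed random effect by a weighted sum over a few nodes. A minimal sketch (the helper name `gh_expectation` is hypothetical, not from the cited works):

```python
import numpy as np

def gh_expectation(f, sigma=1.0, n_nodes=20):
    """Approximate E[f(b)] for b ~ N(0, sigma^2) via Gauss-Hermite quadrature.

    The change of variables b = sqrt(2)*sigma*x turns the Gaussian density
    into the exp(-x^2) weight function that hermgauss assumes.
    """
    x, w = np.polynomial.hermite.hermgauss(n_nodes)
    return (w * f(np.sqrt(2.0) * sigma * x)).sum() / np.sqrt(np.pi)

# Example: E[exp(b)] for b ~ N(0, 1) should be exp(0.5)
approx = gh_expectation(np.exp, sigma=1.0)
```

In a random-effects likelihood, `f` would be the conditional likelihood of one subject's data given the random effect; the same node/weight trick applies inside each subject's marginal likelihood.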
“…Since the fitting process involves maximizing a likelihood function, GBTM may converge slowly as the number of time points for each time series or the total number of time series grows, or it may even fail to converge when using higher orders of polynomial functions. [19][20][21] Recent advances in machine learning provide more statistical tools to understand and characterize time series, such as recurrent neural networks (RNN) [22][23][24][25] and long short-term memory (LSTM) auto-encoders followed by K-means clustering. 26 Without assuming any specific form of the exposure patterns, we explicitly define a list of features summarizing individual-level prescription data.…”
Section: Introduction
confidence: 99%
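The feature-based alternative quoted above, summarizing each trajectory by hand-crafted features and then clustering, might look like the following sketch. The three features chosen here (overall level, time trend, switch rate) and both function names are illustrative assumptions; the cited work defines its own feature list for prescription data.

```python
import numpy as np

def trajectory_features(Y):
    """Summarize each binary trajectory (row of Y) with three features:
    overall level, least-squares time trend, and switch rate."""
    n, T = Y.shape
    t = np.arange(T) - (T - 1) / 2.0               # centered time index
    level = Y.mean(axis=1)
    slope = (Y * t).sum(axis=1) / (t ** 2).sum()   # per-row OLS slope
    switches = np.abs(np.diff(Y, axis=1)).sum(axis=1) / (T - 1)
    return np.column_stack([level, slope, switches])

def kmeans(X, K=2, n_iter=50):
    """Plain K-means, initialized from quantiles of the first feature
    for a deterministic, reproducible sketch."""
    order = np.argsort(X[:, 0])
    C = X[order[np.linspace(0, len(X) - 1, K).astype(int)]].astype(float)
    for _ in range(n_iter):
        d = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for k in range(K):
            if (labels == k).any():
                C[k] = X[labels == k].mean(axis=0)
    return labels
```

Unlike the likelihood-based GBTM fit, this approach imposes no parametric form on the trajectories, which is exactly the trade-off the excerpt describes.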