Gaussian mixture model for texture characterization with application to brain DTI images (2019)
DOI: 10.1016/j.jare.2019.01.001

Cited by 31 publications (14 citation statements)
References 41 publications
“…These two steps are alternated until a stopping criterion is met, namely when there is no further change in the assignment of the data points. The k-means algorithm aims at minimizing an objective function, which in this case is the squared error function between µ_k and the data points in cluster c_k [21,26,27]:…”
Section: Mathematical Approaches
confidence: 99%
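The alternate-and-update loop and squared-error objective described in this excerpt can be sketched as follows. This is a minimal NumPy illustration of the standard k-means formulation, not the cited authors' implementation; the function name, iteration cap, and random initialization are assumptions:

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Minimal k-means: alternate the assignment and update steps until
    the cluster assignments stop changing, minimizing the squared error
    between each point and its centroid mu_k."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    labels = np.full(len(X), -1)
    for _ in range(n_iter):
        # Assignment step: each point joins the cluster of its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        new_labels = dists.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break  # stopping criterion: no further change in the assignments
        labels = new_labels
        # Update step: mu_k becomes the mean of the points in cluster c_k.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    # Objective: total squared error between points and their centroids.
    sse = float(((X - centroids[labels]) ** 2).sum())
    return labels, centroids, sse
```

On two well-separated groups of points, the loop converges in a few iterations to assignments that separate the groups, and the returned `sse` is the value of the objective being minimized.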
See 1 more Smart Citation
“…These two steps are alternated until a stopping criterion is met, when there is no further change in the assignment of the data points. The k-means algorithm aims at minimizing an objective function, which in this case is the squared error function between µ k and the data point in cluster c k [21,26,27]:…”
Section: Mathematical Approachesmentioning
confidence: 99%
“…For SC ∈ [0.81, 1.00], the object is very well clustered; for SC ∈ [0.51, 0.80], the object is well clustered; for SC ∈ [0.26, 0.50], the object is poorly clustered; and SC ≤ 0.25 indicates an artificial cluster [26]. Accordingly, when S(i) is close to 1, the 'within' dissimilarity a(i) is much smaller than the smallest 'between' dissimilarity b(i), and the object is 'well-clustered', while for S(i) ≈ 0, a(i) and b(i) are approximately equal, so it is unclear to which cluster object i should be assigned.…”
Section: Mathematical Approaches
confidence: 99%
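The a(i)/b(i) interpretation quoted above follows the standard silhouette definition, S(i) = (b(i) − a(i)) / max(a(i), b(i)). A minimal sketch of that computation (an assumed NumPy helper, not code from the cited work; it assumes at least two clusters):

```python
import numpy as np

def silhouette(X, labels):
    """Silhouette coefficient S(i) = (b(i) - a(i)) / max(a(i), b(i)), where
    a(i) is the mean 'within' dissimilarity of point i to its own cluster and
    b(i) is the smallest mean 'between' dissimilarity to any other cluster."""
    n = len(X)
    S = np.zeros(n)
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)  # distances from point i
        same = (labels == labels[i])
        if same.sum() <= 1:
            continue  # singleton cluster: S(i) = 0 by convention
        a = d[same].sum() / (same.sum() - 1)  # exclude point i itself
        b = min(d[labels == c].mean() for c in set(labels) - {labels[i]})
        S[i] = (b - a) / max(a, b)
    return S
```

For two tight, well-separated clusters every S(i) falls in the "very well-clustered" band near 1, matching the ranges quoted in the excerpt.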
“…GMM is a model for describing a mixture density distribution and is widely used in pattern recognition, cluster analysis, and other fields [28]-[30]. It is not limited to a specific form of the probability density function: any probability density distribution can be approximated by a linear combination of several Gaussian density functions.…”
Section: B Construction of the Probability Density Function Distribution
confidence: 99%
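The linear-combination view described in this excerpt can be illustrated with a short sketch of the mixture density p(x) = Σ_k π_k N(x | µ_k, σ_k²). This is a 1-D illustration with made-up component parameters, not the paper's model:

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Univariate Gaussian density N(x | mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def gmm_pdf(x, weights, mus, sigmas):
    """Mixture density as a linear combination of Gaussian components:
    p(x) = sum_k pi_k * N(x | mu_k, sigma_k^2), with the weights pi_k
    summing to 1 so that p(x) is itself a valid density."""
    return sum(w * gaussian_pdf(x, m, s) for w, m, s in zip(weights, mus, sigmas))
```

With, say, weights (0.4, 0.6), means (0, 5), and standard deviations (1, 1.5), the combination produces a bimodal density that neither Gaussian alone could represent, which is the sense in which a GMM is not tied to one specific density shape.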
“…The classes follow a probability distribution (law), which is normal in the case of Gaussian mixture models (GMM). GMMs [74][75][76][77][78] require few parameters, estimated via a simple likelihood function. These parameters can be obtained by adopting the EM algorithm to maximize the log-likelihood function.…”
confidence: 99%
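The EM estimation of GMM parameters mentioned in this excerpt can be sketched as follows. This is a minimal 1-D illustration, not the cited authors' procedure; the quantile-based initialization and iteration count are assumed choices:

```python
import numpy as np

def _weighted_densities(x, weights, mus, sigmas):
    """Weighted component densities pi_j * N(x | mu_j, sigma_j^2), shape (k, n)."""
    return np.array([w * np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
                     for w, m, s in zip(weights, mus, sigmas)])

def em_gmm_1d(x, k, n_iter=100):
    """Minimal EM for a 1-D Gaussian mixture: alternate the E-step (posterior
    responsibilities) and M-step (weighted parameter updates), which increases
    the log-likelihood of the data at each iteration."""
    mus = np.quantile(x, (np.arange(k) + 0.5) / k)  # spread initial means over quantiles
    sigmas = np.full(k, x.std())
    weights = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibility of each component for each data point.
        dens = _weighted_densities(x, weights, mus, sigmas)
        resp = dens / dens.sum(axis=0, keepdims=True)
        # M-step: re-estimate weights, means, and variances from responsibilities.
        nk = resp.sum(axis=1)
        weights = nk / len(x)
        mus = (resp * x).sum(axis=1) / nk
        sigmas = np.sqrt((resp * (x - mus[:, None]) ** 2).sum(axis=1) / nk)
    log_lik = np.log(_weighted_densities(x, weights, mus, sigmas).sum(axis=0)).sum()
    return weights, mus, sigmas, log_lik
```

On data drawn from two well-separated Gaussians, the fitted means converge close to the true component means, and the mixture weights remain a valid probability distribution.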
“…Gaussian mixture model: the GMM, suggested by many neuroimaging researchers [76,[168][169][170][171][172]], is easy to implement, and it is effective and robust due to its probabilistic basis.…”
confidence: 99%