2012
DOI: 10.1016/j.eij.2012.03.002
Unsupervised learning of mixture models based on swarm intelligence and neural networks with optimal completion using incomplete data

Cited by 5 publications (3 citation statements). References 17 publications.
“…Ghahramani and Jordan dealt with incomplete data via the EM approach from a supervised learning perspective. Abas combined local tuning of the general regression, particle swarm optimization (PSO), and EM to handle incomplete data sets with missing values. Lin and Su proposed a hybrid EM and Bayesian classifier algorithm to solve the problem of incomplete data in feature extraction.…”
Section: Introduction
“…[15] Abas combined local tuning of the general regression, particle swarm optimization (PSO), and EM to handle incomplete data sets with missing values.[16,17] Lin and Su proposed a hybrid EM and Bayesian classifier algorithm to solve the problem of incomplete data in feature extraction.[18] Wang et al. imputed missing data based on the EM algorithm and then performed nonparametric spectral analysis.…”
Section: Introduction
“…environments [131–136]. There exist various clustering techniques, including K-means [137], mixture models [138], hierarchical clustering [139], self-organizing maps [103], and adaptive resonance theory [140].…”
Section: Results
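Of the clustering techniques listed in that excerpt, K-means is the simplest to sketch. The following is a minimal, hedged illustration of Lloyd's algorithm (not code from the cited works); the function name `kmeans` and the initialization scheme are assumptions for the example.

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Plain Lloyd's algorithm: alternate nearest-centroid assignment
    and centroid re-estimation until the centroids stop moving."""
    rng = np.random.default_rng(seed)
    # Initialize centroids as k distinct data points.
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign each point to its nearest centroid (Euclidean distance).
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute each centroid as the mean of its assigned points;
        # keep the old centroid if a cluster ends up empty.
        new_centers = np.array([
            X[labels == j].mean(axis=0) if (labels == j).any() else centers[j]
            for j in range(k)
        ])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers
```

Mixture models generalize this hard assignment to soft (probabilistic) responsibilities, which is exactly where the EM algorithm discussed in the other citation statements enters.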