2019
DOI: 10.1142/s0218001420500068
Improving Accuracy of Evolving GMM Under GPGPU-Friendly Block-Evolutionary Pattern

Abstract: As a classical clustering model, the Gaussian Mixture Model (GMM) can serve as a cornerstone of dominant machine learning methods such as transfer learning. Evolving GMM is an approximation to the classical GMM for time-critical or memory-critical application scenarios. Such applications often impose constraints on time-to-answer or handle high data volumes, and therefore raise high computational demands. A prominent approach to meeting these demands is GPGPU-powered computing. However, the existing evolving GMM algorithms are confronted with a d…

Cited by 3 publications (2 citation statements)
References 19 publications
“…where, L denotes the model likelihood; 𝑣 denotes the degree of freedom of the model parameters; and 𝑁 denotes the number of training data points. The model with the lowest BIC value is selected because it maximizes the log-likelihood [6]. Algorithm 1 presents the steps of the learning process.…”
Section: Model Training
confidence: 99%
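The citation statement above describes standard BIC-based model selection: the candidate with the lowest BIC is chosen, where BIC penalizes the log-likelihood L by the number of free parameters v and the number of training points N. The following is a minimal sketch of that selection rule, assuming the common form BIC = v·ln(N) − 2·ln(L); the candidate likelihood values and the 2-D full-covariance parameter count are hypothetical illustrations, not taken from the paper.

```python
import math

def bic(log_likelihood, num_params, num_points):
    # BIC = v * ln(N) - 2 * ln(L); lower is better.
    return num_params * math.log(num_points) - 2.0 * log_likelihood

def gmm_num_params(k, d=2):
    # Free parameters of a k-component GMM with full covariances in d dims:
    # (k - 1) mixing weights, k*d means, k * d*(d+1)/2 covariance entries.
    return (k - 1) + k * d + k * d * (d + 1) // 2

# Hypothetical fitted log-likelihoods for k = 1..4 components on N points.
candidates = {1: -1450.0, 2: -1210.0, 3: -1205.0, 4: -1202.0}
N = 500

scores = {k: bic(ll, gmm_num_params(k), N) for k, ll in candidates.items()}
best_k = min(scores, key=scores.get)  # model with the lowest BIC is selected
```

Note how the penalty term v·ln(N) stops the selection from always preferring the largest k, even though the raw log-likelihood keeps improving with more components.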
“…Moreover, the decision for the frequency of retraining and redeployment is a difficult task. Another promising approach is to use an evolving or incremental learning method [4] [5], where the model is updated when a new subset of data arrives [6]. Each iteration is considered as an incremental step toward revisiting the current model.…”
Section: Introduction
confidence: 99%
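The citation statement above describes the evolving/incremental pattern: the model is updated whenever a new subset of data arrives, instead of retraining from scratch. As a minimal sketch of that idea, the class below incrementally maintains a single 1-D Gaussian from sufficient statistics (count, sum, sum of squares) folded in batch by batch; a full evolving GMM would keep such statistics per component, but the class name and the single-Gaussian simplification are illustrative assumptions, not the paper's algorithm.

```python
class RunningGaussian:
    """1-D Gaussian whose mean/variance are updated incrementally
    from sufficient statistics, without revisiting past data."""

    def __init__(self):
        self.n = 0      # number of points seen so far
        self.s1 = 0.0   # running sum of x
        self.s2 = 0.0   # running sum of x^2

    def update(self, batch):
        # Fold a newly arrived subset of data into the model;
        # each call is one incremental step revising the current model.
        for x in batch:
            self.n += 1
            self.s1 += x
            self.s2 += x * x

    @property
    def mean(self):
        return self.s1 / self.n

    @property
    def var(self):
        # Population variance: E[x^2] - (E[x])^2.
        return self.s2 / self.n - self.mean ** 2

g = RunningGaussian()
g.update([1.0, 2.0, 3.0])   # first data subset arrives
g.update([4.0, 5.0])        # later subset: update, do not retrain
```

Because only constant-size statistics are stored per component, this update pattern fits the memory-critical and block-wise (GPGPU-friendly) setting the paper targets.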