2018
DOI: 10.1049/iet-ipr.2018.0043
Entropy‐based variational Bayes learning framework for data clustering

Abstract: A novel framework is developed for the modelling and clustering of proportional data (i.e. normalised histograms) based on the Beta‐Liouville mixture model. This framework is based on incremental model selection, testing whether a given component is truly Beta‐Liouville distributed. Specifically, the authors compare the theoretical maximum entropy of the given component with the estimated entropy obtained by the MeanNN estimator. If a significant difference emerges from this comparison, this component is con…
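The entropy test described in the abstract can be sketched as follows. This is a minimal 1-D illustration only: the pairwise log-distance estimate (in the MeanNN style), the threshold `tol`, and the function names are assumptions for illustration, not the paper's exact procedure for Beta‐Liouville components.

```python
import numpy as np

def meannn_entropy(x):
    """Pairwise log-distance entropy estimate for 1-D samples (up to an additive constant)."""
    n = len(x)
    dists = np.abs(x[:, None] - x[None, :])   # all pairwise distances
    off = dists[~np.eye(n, dtype=bool)]       # drop the zero diagonal
    return np.mean(np.log(off + 1e-12))       # mean log-distance term

def component_is_suspect(x, h_max, tol=0.5):
    """Flag a component whose estimated entropy deviates from a theoretical maximum.

    `h_max` would be the maximum entropy implied by the fitted component;
    a large gap suggests the component is not well modelled and should be split.
    """
    return abs(meannn_entropy(x) - h_max) > tol
```

A component whose samples yield an entropy estimate far from the entropy its fitted distribution would imply is a candidate for splitting; the actual decision rule and significance test in the paper are not reproduced here.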

Cited by 13 publications (5 citation statements)
References 33 publications
“…Moreover, the model complexity can be easily solved using, for example, the marginal-likelihood-based technique. Thus, our focus in this paper is to implement an effective Bayesian learning method for SSDMM in order to take into account the complexity of medical data and to overcome the drawbacks of frequentist (deterministic) approaches [34, 35]. To the best of our knowledge, such an approach has never been tackled before, especially for the problem of chest X-ray image classification.…”
Section: Motivations
confidence: 99%
“…For the parameter estimation problem, we focus here on the application of variational Bayes with the mean-field approximation, which has been shown to be an efficient technique for inferring posterior distributions of mixture models [20, 25, 26]. Indeed, variational Bayes has been proposed as an efficient solution for posterior approximation with low computational cost, as opposed to other inference approaches such as the MCMC technique [8, 27].…”
Section: Variational Bayesian Learning Via Entropy-based Splitting
confidence: 99%
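As a concrete illustration of mean-field variational Bayes (not the paper's Beta‐Liouville model), here is a hedged coordinate-ascent (CAVI) sketch for a toy Bayesian Gaussian mixture with known unit variance and a N(0, `prior_var`) prior on each component mean; all names and hyperparameter values are illustrative assumptions.

```python
import numpy as np

def cavi_gmm(x, K=2, prior_var=10.0, iters=50, seed=0):
    """Mean-field CAVI for a toy Bayesian GMM with unit-variance components."""
    rng = np.random.default_rng(seed)
    m = rng.normal(size=K)      # variational means of q(mu_k)
    v = np.ones(K)              # variational variances of q(mu_k)
    for _ in range(iters):
        # Update q(z): responsibilities from expected log-densities E_q[log N(x | mu_k, 1)]
        logits = x[:, None] * m[None, :] - 0.5 * (m**2 + v)[None, :]
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        r = np.exp(logits)
        r /= r.sum(axis=1, keepdims=True)
        # Update q(mu_k): conjugate Gaussian posterior given the soft counts
        v = 1.0 / (1.0 / prior_var + r.sum(axis=0))
        m = v * (r * x[:, None]).sum(axis=0)
    return m, r
```

The updates alternate between the assignment posterior q(z) and the mean posteriors q(mu_k), each available in closed form under the mean-field factorisation — the same structure the citing papers apply to richer mixture families.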
“…Finally, the solutions of the updated variational posteriors are obtained by optimizing with respect to each distribution. The resulting solutions are expressed in closed form, where the hyperparameters in the above equations can be fixed in a similar way as in [26], by testing and experimenting with different values depending on the data set to model.…”
Section: Variational Bayesian Learning Via Entropy-based Splitting
confidence: 99%
“…The third developed learning algorithm is called variational Bayesian inference, which is an efficient alternative to the previous Bayesian learning method [8,33,34]. In this work, we tackle the case of a multi-dimensional problem for variational Bayesian learning to classify biomedical images.…”
Section: Variational Learning
confidence: 99%
“…In the statistical learning context, we can identify different learning approaches for making inference on parameters in mixture models; in particular, deterministic inference via maximum likelihood (ML) or maximum a posteriori (MAP) and non-deterministic inference via Bayesian inference or variational inference [8,9]. Our work here is motivated by the interesting results obtained with the finite Gamma mixture in the case of data clustering and segmentation.…”
Section: Introduction
confidence: 99%