2019
DOI: 10.1007/s00500-019-04567-2
A new hybrid discriminative/generative model using the full-covariance multivariate generalized Gaussian mixture models

Cited by 14 publications (20 citation statements)
References 36 publications
“…Actually, many classifiers such as SVM show great potential compared to generative models for several applications [30, 31]. However, in most applications the conventional SVM kernels (i.e., linear, polynomial, RBF) [30] cannot account for the nature of the data, and it has been noted that choosing conventional SVM kernels is not always the right choice [14]. This disadvantage limits their performance.…”
Section: Discriminative Learning Approach Based on SSDMM
confidence: 99%
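For concreteness, the three conventional kernels named in the excerpt can be written out directly. This is a minimal numpy sketch; the `gamma`, `degree` and `coef0` defaults are illustrative, not values taken from any cited work:

```python
import numpy as np

# The three conventional SVM kernels mentioned above, as plain functions
# of two sample vectors x and y. Hyperparameter defaults are illustrative.

def linear_kernel(x, y):
    return float(x @ y)

def polynomial_kernel(x, y, degree=3, coef0=1.0):
    return float((x @ y + coef0) ** degree)

def rbf_kernel(x, y, gamma=0.5):
    return float(np.exp(-gamma * np.sum((x - y) ** 2)))

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])
print(linear_kernel(x, y))      # 0.0
print(polynomial_kernel(x, y))  # (0 + 1)^3 = 1.0
print(rbf_kernel(x, y))         # exp(-0.5 * 2) = exp(-1)
```

None of these kernels depends on a model of how the data was generated, which is exactly the limitation the excerpt points to.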
“…It is noted that when the data follows a non-Gaussian distribution in nature, the Gaussian model can give weak performance. Noticing this fact, various mixture models have been proposed in the literature; some distinguished ones are based on the generalized Gaussian [14], Dirichlet and generalized Dirichlet [15], Beta-Liouville [15], and Student's t distributions. For instance, the Dirichlet mixture and its extensions (such as the generalized Dirichlet) have been successfully employed and can often outperform the Gaussian model for data clustering, categorization and action recognition [9, 16, 17, 18].…”
Section: Introduction
confidence: 99%
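The generalized Gaussian named in the excerpt extends the Gaussian with a shape parameter. A minimal univariate sketch in the standard parameterization (all parameter values illustrative): the shape `beta = 2` recovers the Gaussian form and `beta = 1` the Laplace, which is why a mixture of such components can adapt to non-Gaussian tails.

```python
import math

def ggd_pdf(x, mu=0.0, alpha=1.0, beta=2.0):
    # Univariate generalized Gaussian density with location mu,
    # scale alpha and shape beta (standard parameterization):
    #   f(x) = beta / (2 * alpha * Gamma(1/beta)) * exp(-(|x - mu| / alpha)^beta)
    c = beta / (2.0 * alpha * math.gamma(1.0 / beta))
    return c * math.exp(-((abs(x - mu) / alpha) ** beta))

print(ggd_pdf(0.0, beta=2.0))  # 1/sqrt(pi): Gaussian shape (sigma^2 = 1/2)
print(ggd_pdf(0.0, beta=1.0))  # 0.5: Laplace shape with unit scale
```

The full-covariance multivariate form used in the paper above replaces `|x - mu| / alpha` with a Mahalanobis-type distance; that extension is not sketched here.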
“…It may be important to consider the nature of the observed data when determining a particular probability distribution. For several decades the Gaussian distribution has been applied as the basic distribution for many data analysis problems [8]; however, several recent works have shown that it is not the best approximation, especially when treating non-Gaussian data [9] (i.e., data with non-negativity and unit-sum restrictions).…”
Section: Introduction
confidence: 99%
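For data with exactly these restrictions (non-negative components summing to one), the Dirichlet distribution cited in the excerpts is the standard choice. A minimal sketch of its density in the standard form; the argument values in the usage line are illustrative:

```python
import math

def dirichlet_pdf(x, alpha):
    # Dirichlet density on the simplex:
    #   f(x) = Gamma(sum(alpha)) / prod(Gamma(alpha_i)) * prod(x_i^(alpha_i - 1))
    # Valid only for non-negative x summing to one.
    assert abs(sum(x) - 1.0) < 1e-9 and all(xi >= 0 for xi in x)
    norm = math.gamma(sum(alpha)) / math.prod(math.gamma(a) for a in alpha)
    return norm * math.prod(xi ** (ai - 1.0) for xi, ai in zip(x, alpha))

# With all concentration parameters equal to 1 the density is uniform
# over the simplex, with constant value Gamma(3) = 2 in three dimensions.
print(dirichlet_pdf([0.2, 0.3, 0.5], [1.0, 1.0, 1.0]))  # 2.0
```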
“…To take advantage of both approaches, in this paper we consider a hybrid generative-discriminative learning approach. A hybrid framework can be seen as the integration of prior knowledge about the classification task into the training operation to achieve the highest accuracy [8], [22]-[24]. This paper focuses on increasing classification performance by introducing a new hybrid learning approach that combines the benefits of discriminative SVM kernels and the generative IBLMM mixture.…”
Section: Introduction
confidence: 99%
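The hybrid idea in the excerpt can be sketched as a generative model supplying the feature map on which a discriminative classifier then operates. In this minimal sketch a two-component 1-D Gaussian mixture stands in for the IBLMM of the cited work, and all parameter values are illustrative:

```python
import numpy as np

# Fixed generative model: a two-component Gaussian mixture (a stand-in
# for the IBLMM; in practice these parameters come from an EM fit).
means   = np.array([-2.0, 2.0])
sigmas  = np.array([1.0, 1.0])
weights = np.array([0.5, 0.5])

def responsibilities(x):
    # p(component k | x): the generative feature vector that a
    # discriminative stage (e.g. an SVM kernel) would consume.
    dens = weights * np.exp(-0.5 * ((x - means) / sigmas) ** 2) \
           / (sigmas * np.sqrt(2.0 * np.pi))
    return dens / dens.sum()

r = responsibilities(-2.0)
print(r)  # mass concentrated on the first component
```

The discriminative stage is omitted here; the point is only that the mixture converts raw observations into probabilistic features that reflect the nature of the data.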
“…A difficult aspect when considering finite mixture models is determining the exact number of components, which helps avoid over- and under-fitting and minimizes approximation errors, especially when modelling complex real-world data (such as multimodal data) [9]. For instance, Laplace and Normal densities fail to fit many complex shapes in multi-dimensional data and when describing the heavier tails caused by specific patterns [10, 11]. In this context, non-Gaussian data modelling plays an essential role in accurate data clustering and classification.…”
Section: Introduction and Related Work
confidence: 99%
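One common way to address the component-number problem raised in this excerpt is an information criterion such as BIC, which penalizes the fitted log-likelihood by model size. A minimal sketch; the log-likelihood values below are illustrative placeholders for what an EM fit would return, not results from any cited work:

```python
import math

def bic(log_likelihood, n_params, n_samples):
    # Bayesian information criterion: lower is better. Adding components
    # must buy enough likelihood to pay the log(n)-scaled parameter cost.
    return n_params * math.log(n_samples) - 2.0 * log_likelihood

n = 500
# (components, free parameters, fitted log-likelihood) -- illustrative
candidates = [(1, 2, -1450.0), (2, 5, -1210.0), (3, 8, -1205.0)]
best = min(candidates, key=lambda c: bic(c[2], c[1], n))
print(best[0])  # 2: the small gain from a third component does not pay for its parameters
```

Picking the minimizer over a range of candidate counts is exactly the guard against the over- and under-fitting the excerpt describes.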