2016
DOI: 10.1016/j.compag.2016.03.026
A hybrid ensemble for classification in multiclass datasets: An application to oilseed disease dataset

Cited by 56 publications (16 citation statements, from citing works published 2017–2023). References 35 publications.
“…In this work, Naïve Bayesian is used as a classifier. This technique is useful because it can calculate the class posterior probabilities [55,56]. 12.51% (635 patients), 7.49% (380 patients) and 8.75% (444 patients) of the patients under study were placed in the levels of BI-RADS 0 to BI-RADS VI, respectively.…”
Section: Naïve Bayesian (NB)
Citation type: mentioning, confidence: 99%
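The excerpt above points to the defining property of Naïve Bayes: it yields class posterior probabilities. Below is a minimal sketch of that property using scikit-learn's GaussianNB on synthetic data; the data and settings are placeholders, not the citing study's BI-RADS pipeline.

```python
# Minimal sketch (synthetic data, not the citing study's setup): a Gaussian
# Naive Bayes classifier producing class posterior probabilities.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 3)),   # class 0 samples
               rng.normal(2.0, 1.0, (50, 3))])  # class 1 samples
y = np.array([0] * 50 + [1] * 50)

nb = GaussianNB()
nb.fit(X, y)

# Posterior P(class | features) for one sample; the predicted class is the argmax.
print(nb.predict_proba(X[:1]), nb.predict(X[:1]))
```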
“…For example, stacked generalisation, classifier ensembles, hybrid methods and mixture of experts are some of the common ones. Previous works have shown that different combinations of classifiers are employed to improve the performance of the constituent classifiers [17][18][19]. In this work, however, these different terminologies for classifier combination and the significance of information fusion are unified by adopting the following point of view: search for the best set of classifiers by fusing the outputs of the individual classifiers, without considering the nature of the feature set or the pattern representation of the input data.…”
Section: Related Work
Citation type: mentioning, confidence: 99%
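The excerpt describes fusing only the outputs of individual classifiers, independent of the feature representation. A hedged sketch of that idea as plain majority voting over base learners; the specific models and data below are assumptions, not the cited paper's configuration.

```python
# Output-level fusion sketch: each base classifier votes with its predicted
# label and the ensemble returns the majority label. Base learners and data
# are placeholders.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, n_features=8, n_classes=3,
                           n_informative=5, random_state=1)

base_learners = [LogisticRegression(max_iter=1000),
                 GaussianNB(),
                 DecisionTreeClassifier(random_state=1)]
for clf in base_learners:
    clf.fit(X, y)

# Fuse only the predicted labels (no shared feature representation is assumed).
votes = np.stack([clf.predict(X) for clf in base_learners])   # shape (n_clf, n_samples)
fused = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
print("fused accuracy:", (fused == y).mean())
```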
“…Multiclass datasets often yield poor classification results because the classes are numerous and unevenly distributed. As the number of multiclass datasets grows, new machine learning algorithms [11][12][13] need to be developed to improve class prediction, and multiple learning models can often be applied to the same problem [14][15][16]. Ensemble learning is a machine learning approach that improves prediction performance [17][18][19], [20] by combining the predictions of several learners, which also reduces the risk of selecting an inappropriate model.…”
Section: Introduction
Citation type: mentioning, confidence: 99%
“…Ensemble learning is a machine learning approach that improves prediction performance [17][18][19], [20] by combining the predictions of several learners, which also reduces the risk of selecting an inappropriate model. This approach is popular and widely used because it generally outperforms individual models [13,21,22]. Given the effectiveness of ensemble learning, new ensemble models are being developed to improve accuracy and stability [16,20,23].…”
Section: Introduction
Citation type: mentioning, confidence: 99%
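Both introduction excerpts describe combining several learners to improve accuracy and stability and to reduce the risk of committing to a single inappropriate model. A minimal illustration of that strategy with scikit-learn's StackingClassifier, using assumed base models and synthetic data rather than the hybrid ensemble proposed in the cited article.

```python
# Sketch of combining multiple learners through a meta-learner (stacking);
# the base models and synthetic data are assumptions for illustration only.
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=10, n_classes=3,
                           n_informative=6, random_state=2)

stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=100, random_state=2)),
                ("nb", GaussianNB())],
    final_estimator=LogisticRegression(max_iter=1000))

# Cross-validated accuracy of the stacked ensemble versus a single base model.
print("stack:", cross_val_score(stack, X, y, cv=5).mean())
print("rf   :", cross_val_score(RandomForestClassifier(random_state=2), X, y, cv=5).mean())
```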