2014
DOI: 10.14445/22312803/ijctt-v10p107
Ensemble Classifiers and Their Applications: A Review

Abstract: An ensemble classifier is a group of individual classifiers that are cooperatively trained on a data set for a supervised classification problem. In this paper we review ensemble classifiers commonly used in the literature. Some ensemble classifiers have also been developed for specific applications, and we present several such application-driven ensemble classifiers as well.


Cited by 50 publications (32 citation statements). References 46 publications.
“…This technique works especially well for unstable learning algorithms, whose output classifier undergoes major changes in response to small changes in the training data: decision tree, neural network, and rule learning algorithms are all unstable, while linear regression, nearest neighbor, and linear threshold algorithms are generally very stable. Different network weights are used to train the base neural network learning process [11]. These methods achieve better generalization.…”
Section: Manipulation of the Training Parameters
confidence: 99%
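The bootstrap-resampling idea behind exploiting unstable base learners can be sketched in plain Python. This is a minimal illustration, not code from the paper: the `train_stump` base learner, the toy data, and all names are hypothetical.

```python
import random
from collections import Counter

def train_stump(data):
    """Train a one-threshold 'stump' on (x, label) pairs with labels 0/1.
    A deliberately simple, unstable base learner: small changes in the
    training sample can move the chosen threshold."""
    best = None
    for threshold, _ in data:
        # Predict 1 when x >= threshold; count training errors.
        errors = sum(1 for x, y in data if (x >= threshold) != (y == 1))
        if best is None or errors < best[1]:
            best = (threshold, errors)
    threshold = best[0]
    return lambda x: 1 if x >= threshold else 0

def bagged_ensemble(data, n_models=25, seed=0):
    """Bagging: train each base learner on a bootstrap resample of the
    data, then combine the learners by majority vote."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        sample = [rng.choice(data) for _ in range(len(data))]  # resample with replacement
        models.append(train_stump(sample))
    def predict(x):
        votes = Counter(m(x) for m in models)
        return votes.most_common(1)[0][0]
    return predict

# Toy one-dimensional data set (illustrative only).
data = [(0.1, 0), (0.3, 0), (0.4, 0), (0.6, 1), (0.8, 1), (0.9, 1)]
predict = bagged_ensemble(data)
```

Averaging many stumps trained on different resamples smooths out the instability of any single stump, which is the generalization benefit the quoted passage refers to.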
“…In this case, an error penalty is imposed if base classifiers make identical errors on similar patterns [11]. Negative correlation learning is an example of such an ensemble.…”
Section: Manipulation of the Error Function
confidence: 99%
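Negative correlation learning augments each base network's squared error with a penalty that grows when the network's deviation from the ensemble mean correlates with the other networks' deviations. A minimal sketch of that per-network error follows; the function name, argument names, and the penalty coefficient `lam` are illustrative, assuming the standard NCL formulation.

```python
def ncl_error(outputs, target, i, lam=0.5):
    """Per-network error in negative correlation learning:
    0.5 * (f_i - d)^2 + lam * (f_i - f_bar) * sum_{j != i} (f_j - f_bar),
    where f_bar is the ensemble mean output and d is the target."""
    f_bar = sum(outputs) / len(outputs)
    penalty = (outputs[i] - f_bar) * sum(
        outputs[j] - f_bar for j in range(len(outputs)) if j != i
    )
    return 0.5 * (outputs[i] - target) ** 2 + lam * penalty
```

Because the deviations from the mean sum to zero, the penalty is negative whenever network `i` disagrees with the rest, so minimizing this error actively rewards diversity rather than merely tolerating it.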
“…Usually, diversity of the base classifiers is achieved by training a single learning algorithm on different data sets (bootstrap resampling) [11] [14] [15] [16], with different parameters [14], or on different features [1] [7] [8]. However, some works achieve diversity by training different learning algorithms on the same data.…”
Section: Related Work
confidence: 99%
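The "different learning algorithms on the same data" route reduces, at prediction time, to a plurality vote over heterogeneous classifiers. The three base rules below are hypothetical stand-ins for models produced by different algorithms; only the voting logic is the point.

```python
from collections import Counter

def majority_vote(classifiers, x):
    """Combine heterogeneous base classifiers by plurality voting."""
    votes = Counter(clf(x) for clf in classifiers)
    return votes.most_common(1)[0][0]

# Hypothetical base classifiers standing in for different learning
# algorithms trained on the same data.
clf_threshold = lambda x: 1 if x > 0.5 else 0       # e.g. a decision stump
clf_quadratic = lambda x: 1 if x * x > 0.4 else 0   # e.g. a kernel-style rule
clf_constant  = lambda x: 0                         # e.g. a majority-class baseline
ensemble = [clf_threshold, clf_quadratic, clf_constant]
```

With an odd number of voters and binary labels, the plurality vote is always well defined; ties only arise with an even ensemble size.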
“…Building ensembles of classifiers is a powerful method for obtaining better classification accuracy by combining the classifications of multiple classifiers [1]. Boosting [2] [3] and bagging [2] [3] are the two most commonly used methods for building ensembles of homogeneous classifiers.…”
Section: Introduction
confidence: 99%
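Boosting, the other homogeneous-ensemble method named here, can be illustrated with a minimal AdaBoost loop over threshold stumps (labels in {-1, +1}). This is a sketch under simplifying assumptions, with toy one-dimensional data and illustrative names, not the cited algorithm verbatim.

```python
import math

def adaboost(data, n_rounds=10):
    """Minimal AdaBoost: repeatedly fit the weighted-error-minimizing
    threshold stump, then increase the weights of misclassified points
    so later stumps focus on them."""
    n = len(data)
    w = [1.0 / n] * n
    model = []  # list of (alpha, threshold, polarity)
    for _ in range(n_rounds):
        best = None
        for threshold, _ in data:
            for polarity in (1, -1):
                err = sum(wi for wi, (x, y) in zip(w, data)
                          if polarity * (1 if x >= threshold else -1) != y)
                if best is None or err < best[0]:
                    best = (err, threshold, polarity)
        err, threshold, polarity = best
        err = min(max(err, 1e-10), 1 - 1e-10)   # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)  # stump weight in the vote
        model.append((alpha, threshold, polarity))
        # Re-weight: boost the weight of misclassified examples.
        for i, (x, y) in enumerate(data):
            pred = polarity * (1 if x >= threshold else -1)
            w[i] *= math.exp(-alpha * y * pred)
        total = sum(w)
        w = [wi / total for wi in w]
    def predict(x):
        score = sum(a * p * (1 if x >= t else -1) for a, t, p in model)
        return 1 if score >= 0 else -1
    return predict

# Toy data set (illustrative only).
boost_data = [(0.1, -1), (0.2, -1), (0.3, -1), (0.7, 1), (0.8, 1), (0.9, 1)]
boost_predict = adaboost(boost_data)
```

Unlike bagging, where resamples are drawn independently, each boosting round depends on the previous one through the example weights, which is why boosting targets hard examples rather than variance.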