2013
DOI: 10.1007/s13721-013-0034-x
Classification of microarray cancer data using ensemble approach

Abstract: An ensemble of classifiers is created by combining the predictions of multiple component classifiers to improve prediction performance. In this paper, we conduct an experimental comparison of J48, NB and IBK on nine microarray cancer datasets and also analyze their performance with Bagging, Boosting and Stack Generalization. The experimental results show that all ensemble methods outperform the individual classification methods. We then present a method, referred to as SD-EnClass, for combining classifiers from diffe…
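The abstract describes combining the predictions of multiple component classifiers into an ensemble. A minimal sketch of one common combination rule, plurality (majority) voting, is shown below; this is an illustration only, not the paper's SD-EnClass method, and the classifier names are hypothetical:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-classifier prediction lists (one list per component
    classifier) into an ensemble prediction by plurality vote per instance."""
    combined = []
    for votes in zip(*predictions):
        # Counter.most_common(1) returns the label with the most votes.
        combined.append(Counter(votes).most_common(1)[0][0])
    return combined

# Three hypothetical component classifiers predicting labels for four instances.
p1 = ["A", "B", "B", "A"]
p2 = ["A", "A", "B", "B"]
p3 = ["B", "A", "B", "A"]
print(majority_vote([p1, p2, p3]))  # → ['A', 'A', 'B', 'A']
```

Bagging, Boosting and Stacking (compared in the paper) differ in how the component classifiers are trained and weighted, but all ultimately aggregate component predictions in some such fashion.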

Cited by 64 publications (39 citation statements)
References 18 publications
“…Then, the classifier with the highest class performance for a certain class out of the base classifiers becomes the expert of that class. The class specific performance of a classifier is calculated as: [11] Class specific accuracy = (total no. of correctly predicted instances for a class) / (total no.…”
Section: Apply Classifiers and Combine Results in Order to Make a Decision (mentioning)
confidence: 99%
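The class-specific accuracy and the expert-per-class selection described in the quoted statement can be sketched as follows; the function name and the toy labels are illustrative assumptions, not taken from the paper:

```python
def class_specific_accuracy(y_true, y_pred, cls):
    """Accuracy restricted to instances whose true label is `cls`:
    correctly predicted instances of the class / total instances of the class."""
    total = sum(1 for t in y_true if t == cls)
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p == cls)
    return correct / total if total else 0.0

# Toy example: the base classifier with the highest class-specific
# accuracy for a class becomes the "expert" of that class.
y_true = ["A", "A", "B", "B", "B"]
preds = {
    "clf1": ["A", "B", "B", "B", "A"],
    "clf2": ["A", "A", "B", "A", "B"],
}
for cls in ("A", "B"):
    expert = max(preds, key=lambda name: class_specific_accuracy(y_true, preds[name], cls))
    print(cls, "expert:", expert)
```

Ties between classifiers would need an explicit tie-breaking rule; the sketch simply keeps the first classifier encountered.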
“…Canedo et al [10] described a new framework for feature selection consisting of an ensemble of filters and classifiers. And also Nagi et al [11] combined the results of Boosting, Bagging and Stacking to obtain results which are significantly better than using Boosting, Bagging or Stacking alone. Liu et al [12] proposed a new ensemble gene selection method in which each gene subset is obtained by the same gene selector with different starting point.…”
Section: Existing Work (mentioning)
confidence: 99%
“…Bolón-Canedo et al provided a novel framework for feature selection by an ensemble of filters and classifiers [10]. Combining classifiers from different classification families into an ensemble based on the evaluation of performance of each classifier, Nagi and Bhattacharyya proposed an ensemble method named as SD-EnClass [11]. To ensure a high classification accuracy, Ghorai et al showed an ensemble of nonparallel plane proximal classifiers based on the genetic algorithm through simultaneous feature and model selection scheme [12].…”
Section: Introduction (mentioning)
confidence: 99%
“…The embedded techniques (Wahid et al, 2011[32]) allow interaction of different class of learning algorithms. More recently, the ensemble model (Nagi and Bhattacharyya, 2013[25]) based on different sub sampling strategies, the learning algorithms run on a number of sub samples and the acquired features are united into a stable subset. However the feature selection techniques can be also categorized based on search strategies used such as forward selection, backward elimination, forward stepwise selection, backward stepwise selection and random mutation (Mladeni, 2006[24]).…”
Section: Introduction (mentioning)
confidence: 99%