2003
DOI: 10.1007/3-540-44989-2_136
Neural Network Ensemble with Negatively Correlated Features for Cancer Classification

Abstract: The development of microarray technology has supplied a large volume of data to many fields. In particular, it has been applied to the prediction and diagnosis of cancer, and is expected to help us predict and diagnose cancer exactly. It is essential to analyze DNA microarray data efficiently because the amount of DNA microarray data is usually very large. Since accurate classification of cancer is a very important issue for the treatment of cancer, it is desirable to make a decision by combining the re…

Cited by 6 publications (4 citation statements)
References 6 publications
“…Determining learning algorithms for the sake of producing classifiers and using an ensemble fusion method are the principal tasks in an ensemble classifier (Kuncheva, 2002). Other examples of ensemble classifiers can be found in Banfield et al (2007), derWalt and Barnard (2006), Hansen and Salamon (1990), Plewczynski et al (2006), Wang et al (2003) and Won and Cho (2003).…”
Section: Majority Voting With Multiple Learning Algorithms
confidence: 98%
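The fusion step described in the statement above — combining the label predictions of several base classifiers by plurality vote — can be sketched as follows. This is a minimal illustration, not the cited authors' implementation; the classifier outputs and class labels ("ALL"/"AML") are hypothetical examples.

```python
from collections import Counter

def majority_vote(predictions):
    """Fuse per-classifier label predictions by plurality (majority) vote.

    predictions: list of lists, where predictions[i][j] is the label
    that base classifier i assigns to sample j. Ties are broken by
    whichever label Counter.most_common happens to list first.
    """
    n_samples = len(predictions[0])
    fused = []
    for j in range(n_samples):
        votes = Counter(clf_preds[j] for clf_preds in predictions)
        fused.append(votes.most_common(1)[0][0])
    return fused

# Three hypothetical base classifiers voting on four samples
preds = [
    ["ALL", "AML", "ALL", "AML"],  # classifier 1
    ["ALL", "ALL", "ALL", "AML"],  # classifier 2
    ["AML", "ALL", "ALL", "AML"],  # classifier 3
]
print(majority_vote(preds))  # → ['ALL', 'ALL', 'ALL', 'AML']
```

The base learners can come from different learning algorithms (as in the statement above) or from resampled views of the same data; the vote-counting step is the same either way.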
“…A combination of bagging and boosting is used by Dettling in BagBoosting, where in each boosting step a bagged classifier is constructed [68]. Alternatively, resampling of the variables also leads to diverse base classifiers [69][70][71][72]. After construction of the base classifiers, their diversity can be evaluated by comparing their predictions [60,69] or the structure of the individual classifiers [63].…”
Section: Ensemble Classifiers
confidence: 99%
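The two sources of diversity mentioned above — bootstrap resampling of the samples (bagging) and resampling of the variables — can be sketched as index generation for each base learner. This is a schematic sketch under assumed parameters (`feat_frac`, the fraction of variables each learner sees, is an illustrative choice, not from the cited papers).

```python
import random

def bootstrap_views(n_samples, n_features, n_learners, feat_frac=0.5, seed=0):
    """Generate one (sample_indices, feature_indices) view per base learner.

    Sample indices are drawn with replacement (bagging); feature indices
    are drawn without replacement (random-subspace-style resampling of
    the variables). Each base classifier is then trained on its own view,
    which is what makes the ensemble members diverse.
    """
    rng = random.Random(seed)
    k = max(1, int(feat_frac * n_features))
    views = []
    for _ in range(n_learners):
        rows = [rng.randrange(n_samples) for _ in range(n_samples)]
        cols = rng.sample(range(n_features), k)
        views.append((rows, cols))
    return views

views = bootstrap_views(n_samples=6, n_features=8, n_learners=3)
for rows, cols in views:
    print(len(rows), sorted(cols))
```

BagBoosting, as described above, would additionally wrap a boosting loop around such bagged views; that outer loop is omitted here for brevity.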
“…Since not all the genes are associated with a specific disease, feature selection, often called gene selection, is necessary to extract informative genes for the classification of the disease [3,15,16]. Moreover, feature selection accelerates the speed of learning a classifier and removes noise from the data.…”
Section: Signal-to-noise Ratio Feature Selection
confidence: 99%
“…It is done simultaneously with the training of a classifier, so as to produce the optimal combination of features and classifier. Since the filter approach is simple and fast enough to obtain high performance, we evaluated various filter-based feature selection methods [15]. Finally, the signal-to-noise ratio ranking method is adopted to select useful features.…”
Section: Signal-to-noise Ratio Feature Selection
confidence: 99%
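Signal-to-noise ratio ranking, as referenced above, scores each gene by how well its class-conditional means separate relative to the class-conditional spreads, SNR(g) = (μ₊ − μ₋)/(σ₊ + σ₋), and keeps the top-ranked genes. A minimal sketch, with toy data (the expression values below are fabricated for illustration only):

```python
from statistics import mean, stdev

def snr_rank(X, y, top_k):
    """Rank features by absolute signal-to-noise ratio and return the
    indices of the top_k most informative ones.

    SNR(g) = (mu_pos - mu_neg) / (sd_pos + sd_neg)
    X: list of samples, each a list of feature (gene expression) values.
    y: binary class labels (1 = positive class, 0 = negative class).
    Each class needs at least two samples and nonzero pooled spread.
    """
    n_features = len(X[0])
    scores = []
    for g in range(n_features):
        pos = [x[g] for x, label in zip(X, y) if label == 1]
        neg = [x[g] for x, label in zip(X, y) if label == 0]
        snr = (mean(pos) - mean(neg)) / (stdev(pos) + stdev(neg))
        scores.append((abs(snr), g))
    return [g for _, g in sorted(scores, reverse=True)[:top_k]]

# Feature 0 separates the two classes; feature 1 is pure noise.
X = [[5.0, 1.0], [5.5, 0.0], [1.0, 1.0], [0.5, 0.0]]
y = [1, 1, 0, 0]
print(snr_rank(X, y, top_k=1))  # → [0]
```

Because the score for each gene is computed independently of any classifier, this is a filter method in the sense of the statement above: fast, simple, and run once before training.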