2010
DOI: 10.1007/s10916-010-9518-8

Diagnosing Breast Masses in Digital Mammography Using Feature Selection and Ensemble Methods

Abstract: Methods that can accurately predict breast cancer are greatly needed, and good prediction techniques can help to predict breast cancer more accurately. In this study, we used two feature selection methods, forward selection (FS) and backward selection (BS), to remove irrelevant features for improving the results of breast cancer prediction. The results show that feature reduction is useful for improving the predictive accuracy, and that density is an irrelevant feature in the dataset, where the data had been identified o…

Cited by 62 publications (23 citation statements)
References 44 publications
“…The methods compared are the following: (i) the multilayer perceptron ensemble (MLPE) method proposed in [57]; (ii) a boosted neural network (BoostNN) classifier in [58]; (iii) a decision tree (DT) and support vector machine sequential minimal optimization (SVM-SMO) based ensemble classifier proposed by Luo and Cheng [59]. The results are listed in Table 8.…”
Section: Evaluation of Kernel PCA Ensemble
confidence: 99%
“…The obtained accuracy of the proposed method, 97%, is comparable with KNN (96%), SVM (87%) and Naive Bayes (89%). In this paper, by assembling three classifiers and applying a single voting policy, we improve the classification results in comparison to the method proposed by Luo et al (13). The obtained results show that our method has a slight improvement over the other proposed methods on the dataset, which is publicly available. Therefore, the proposed method is more reliable in assisting the radiologist in the detection of abnormal data and in improving the diagnostic accuracy.…”
Section: Discussion
confidence: 87%
“…Prathibha et al (12) used the Sequential Floating Forward Selection (SFFS) to reduce the feature dimensionality. Luo et al (13) used two well-known feature selection techniques, forward selection and backward selection, and two classifiers for ensemble classification. They used a decision tree and a support vector machine as the initial classifiers.…”
Section: Introduction
confidence: 99%
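The forward-selection idea referenced above can be sketched as a greedy wrapper search: starting from an empty subset, repeatedly add the feature that most improves cross-validated accuracy, and stop when no candidate helps. This is a minimal illustrative sketch, not the authors' exact procedure; scikit-learn's built-in breast cancer dataset stands in for the mammography mass data, and `forward_select` is a hypothetical helper name.

```python
# Minimal greedy forward feature selection (a sketch, not the paper's
# exact algorithm): grow the feature subset one feature at a time,
# keeping the addition only if cross-validated accuracy improves.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

def forward_select(X, y, estimator, cv=5):
    remaining = list(range(X.shape[1]))
    selected = []
    best_score = 0.0
    while remaining:
        # Score every candidate feature when added to the current subset.
        scores = [(cross_val_score(estimator, X[:, selected + [f]], y,
                                   cv=cv).mean(), f) for f in remaining]
        score, feat = max(scores)
        if score <= best_score:   # no candidate improves accuracy: stop
            break
        best_score = score
        selected.append(feat)
        remaining.remove(feat)
    return selected, best_score

X, y = load_breast_cancer(return_X_y=True)
subset, acc = forward_select(X, y, DecisionTreeClassifier(random_state=0))
print(len(subset), round(acc, 3))
```

Backward selection works symmetrically: start from the full feature set and greedily remove the feature whose deletion most improves (or least harms) the score.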
“…Ensemble methods in machine learning aim to induce a collection of diverse predictors that are both accurate and complementary, so that better prediction accuracy on previously unseen data is obtained when the decisions of the different learners are combined (81).…”
Section: Ensemble Methods
confidence: 99%
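The combination of complementary learners described above can be illustrated with a hard-voting ensemble of three dissimilar classifiers. This is an illustrative sketch under assumed components (decision tree, naive Bayes, logistic regression on scikit-learn's breast cancer dataset), not the setup of the cited paper.

```python
# Illustrative hard-voting ensemble: three dissimilar base learners
# each predict a label, and the majority vote is the ensemble output.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
tree = DecisionTreeClassifier(random_state=0)
ensemble = VotingClassifier(
    estimators=[("dt", tree),
                ("nb", GaussianNB()),
                ("lr", LogisticRegression(max_iter=5000))],
    voting="hard")  # majority vote over the three predicted labels

tree_acc = cross_val_score(tree, X, y, cv=5).mean()
ens_acc = cross_val_score(ensemble, X, y, cv=5).mean()
print(round(tree_acc, 3), round(ens_acc, 3))
```

The gain comes from diversity: when the base learners make largely uncorrelated errors, the majority vote corrects many individual mistakes.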
“…Bagging, also known as bootstrap aggregating, is usually applied to decision tree classifiers; the generalization error is improved by reducing the variance of the base classifier (81). Bagging applies the learning scheme to each of the artificially derived datasets, and the classifiers generated from them vote for the class to be predicted (35).…”
Section: Ensemble Methods
confidence: 99%