2020 International Conference on Electronics and Sustainable Communication Systems (ICESC)
DOI: 10.1109/icesc48915.2020.9155783
Breast Cancer Detection Using K-Nearest Neighbors, Logistic Regression and Ensemble Learning

Cited by 47 publications (20 citation statements); references 4 publications.
“…Support vector machine (SVM), logistic regression (LR), and extreme gradient boosting (XGBoost) have been commonly utilized in related studies of liver lesions, breast lesions, thymomas, and cardiac arrhythmia [12,26–28]. The optimal hyperparameters were determined through a grid search for the three classifier methods in the Python scikit-learn environment (version 0.23.2). First, the search space for hyperparameters was created based on the preliminary experimental results.…”
Section: Classifier Methods
confidence: 99%
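The grid search the excerpt above describes maps naturally onto scikit-learn's GridSearchCV. The sketch below is a minimal, assumption-laden illustration: the parameter grids, cross-validation settings, and the load_breast_cancer dataset are placeholders, not the search space used in the cited study.

```python
# Minimal sketch of a grid search over SVM, logistic regression, and XGBoost
# hyperparameters with scikit-learn; the grids, CV settings, and dataset are
# illustrative assumptions, not the configuration used in the cited study.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)

# One (estimator, parameter grid) pair per classifier method.
searches = {
    "SVM": (SVC(), {"C": [0.1, 1, 10], "kernel": ["linear", "rbf"]}),
    "LR": (LogisticRegression(max_iter=1000), {"C": [0.01, 0.1, 1, 10]}),
    "XGBoost": (XGBClassifier(eval_metric="logloss"),
                {"n_estimators": [100, 300], "max_depth": [3, 5]}),
}

for name, (estimator, grid) in searches.items():
    search = GridSearchCV(estimator, grid, cv=5, scoring="accuracy")
    search.fit(X, y)
    print(name, search.best_params_, round(search.best_score_, 3))
```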
“…RFE was used by many studies as a feature-selection method because of its stability and effectiveness [19,21–23]; it recursively trims the feature set according to current importance until it contains the required number of features. RF and chi-square tests have exhibited efficacy in some studies [24–27]: the former uses the out-of-bag error of each decision tree to calculate the importance of each feature and sorts the features in order of importance, while the latter uses the correlation between the label and each feature to obtain a chi-square value for each feature and then sorts these values from large to small. The three feature-selection methods were implemented in the Python scikit-learn environment (version 0.23.2).…” (a sketch of the three strategies follows below)
Section: Radiomics Features Selection
confidence: 99%
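As a rough illustration of the three feature-selection strategies the excerpt mentions, the sketch below runs RFE, a random-forest importance ranking, and a chi-square test in scikit-learn. The dataset, the number of retained features k, and the base estimators are assumptions; note also that scikit-learn's feature_importances_ is impurity-based rather than the out-of-bag measure the excerpt describes.

```python
# Illustrative sketch of three feature-selection strategies; the dataset,
# target feature count, and base estimators are assumptions.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import RFE, SelectKBest, chi2
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import MinMaxScaler

X, y = load_breast_cancer(return_X_y=True)
k = 10  # number of features to keep (placeholder)

# 1) RFE: recursively trims the feature set until k features remain.
rfe = RFE(LogisticRegression(max_iter=5000), n_features_to_select=k).fit(X, y)

# 2) Random forest: rank features by impurity-based importance
#    (a common proxy for the out-of-bag measure described in the excerpt).
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
rf_top = np.argsort(rf.feature_importances_)[::-1][:k]

# 3) Chi-square test: requires non-negative features, hence the scaling.
X_scaled = MinMaxScaler().fit_transform(X)
chi = SelectKBest(chi2, k=k).fit(X_scaled, y)

print("RFE :", np.where(rfe.support_)[0])
print("RF  :", np.sort(rf_top))
print("chi2:", np.where(chi.get_support())[0])
```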
“…Features are extracted using PCA and transferred to the retraining phase. The model uses a small amount of data and still achieves good accuracy, and the accuracy could improve further with a larger amount of data [49]. The authors of [50] presented a knowledge-based model to classify cancer.…”
Section: Related Work
confidence: 99%
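For context on the PCA-based feature extraction mentioned above, the following is a minimal sketch of projecting features onto principal components before training a classifier; the component count, dataset, and downstream model are illustrative assumptions, not the cited model's configuration.

```python
# Minimal sketch of PCA feature extraction feeding a classifier; the number
# of components and the downstream classifier are illustrative assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Standardize, project onto the leading principal components, then classify.
model = make_pipeline(StandardScaler(), PCA(n_components=10),
                      LogisticRegression(max_iter=1000))
print(cross_val_score(model, X, y, cv=5).mean())
```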
“…In essence, logistic regression is a regression model that predicts the probability that a given data item or entry belongs to a given class. Logistic regression uses a sigmoid function to model the data, as illustrated in Figure 7 [63–65]. Logistic regression offers several advantages, such as simplicity of implementation, computational efficiency, efficient training, and ease of regularization.…”
Section: G. Logistic Regression
confidence: 99%
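To make the sigmoid description concrete, the sketch below defines the logistic function sigma(z) = 1 / (1 + exp(-z)) and checks that a fitted scikit-learn LogisticRegression applies it to the linear score w·x + b; the dataset and pipeline settings are illustrative assumptions.

```python
# Sketch of the sigmoid (logistic) function and a fitted logistic regression;
# the dataset and pipeline settings are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def sigmoid(z):
    """Logistic function: maps any real score z to a probability in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

X, y = load_breast_cancer(return_X_y=True)
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)).fit(X, y)

# predict_proba applies the sigmoid to the linear score w·x + b internally,
# so computing it by hand should give the same class-1 probability.
lr = model.named_steps["logisticregression"]
z = model.named_steps["standardscaler"].transform(X[:1]) @ lr.coef_.T + lr.intercept_
print(sigmoid(z).ravel(), model.predict_proba(X[:1])[:, 1])  # should match
```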