2020
DOI: 10.18280/ts.370611

Glioma Segmentation and Classification System Based on Proposed Texture Features Extraction Method and Hybrid Ensemble Learning

Abstract: This paper presents an efficient and accurate automated system based on a hybrid XGBoost with Random Forest (XGBRF) ensemble model that classifies glioma (the most commonly diagnosed type of brain tumor) into low-grade and high-grade glioma. In this approach, global thresholding is first applied to various MRI sequences and their fusion combinations to perform accurate segmentation. A proposed Enhanced Wavelet Binary Pattern Run Length Matrix (EWBPRL) method is then used for textural feature extraction…
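The segmentation step described in the abstract relies on global thresholding of individual MRI sequences and their fused combinations. The sketch below illustrates that idea on a single slice, assuming a 2-D NumPy array input; the Otsu threshold and the averaging-based fusion in the usage comment are stand-ins, not the authors' exact procedure.

```python
# Minimal sketch of global-threshold segmentation on one MRI slice.
# Assumes the slice is already loaded as a 2-D NumPy array (e.g. from a
# FLAIR or T1ce volume); Otsu's method stands in for whatever global
# threshold the authors actually tuned.
import numpy as np
from skimage.filters import threshold_otsu

def segment_slice(slice_2d: np.ndarray) -> np.ndarray:
    """Return a binary tumour-candidate mask for a single MRI slice."""
    t = threshold_otsu(slice_2d)             # one global threshold per slice
    return (slice_2d > t).astype(np.uint8)   # foreground = candidate tumour region

# Hypothetical usage: fuse two sequences by simple averaging before thresholding.
# flair, t1ce = load_slices(...)             # loading is outside this sketch
# fused = 0.5 * flair + 0.5 * t1ce
# mask = segment_slice(fused)
```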

Cited by 18 publications (11 citation statements)
References 31 publications

“…This technique is a scikit-learn [28] wrapper introduced in the open-source, and still experimental, XGBoost package [46], which implies that the interface can be altered. XGBRF has been used by many studies such as [20,47].…”
Section: Random Forests in XGBoost (XGBRF)
confidence: 99%
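For readers unfamiliar with the wrapper the quoted passage refers to, the following minimal sketch shows how XGBRFClassifier is used through its scikit-learn-style interface; the synthetic data and hyperparameter values are placeholders, not the cited study's configuration.

```python
# Brief sketch of the scikit-learn-style XGBRF wrapper shipped with the
# XGBoost package; values below are illustrative, not a tuned setup.
from xgboost import XGBRFClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = XGBRFClassifier(
    n_estimators=200,        # number of trees in the random forest
    subsample=0.8,           # row subsampling per tree, as in a classic RF
    colsample_bynode=0.8,    # feature subsampling at each split
    random_state=0,
)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```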
“…Ensemble learning can be used to overcome this problem. Machine learning ensembles are made up of several decision trees known as random forests [32].…”
Section: Experiments 3: Ensemble Network Architecture
confidence: 99%
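As the quoted statement notes, a random forest is an ensemble of decision trees. A minimal illustration with scikit-learn's RandomForestClassifier (a generic stand-in, not the cited work's exact model) makes the point explicit:

```python
# A random forest exposes its individual decision trees after fitting.
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=300, n_features=10, random_state=1)
forest = RandomForestClassifier(n_estimators=50, random_state=1).fit(X, y)

print(type(forest.estimators_[0]).__name__)   # -> DecisionTreeClassifier
print("number of trees:", len(forest.estimators_))
```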
“…In this experiment, compared with Principal Component Analysis (PCA) [17] and SelectKBest (SKB) [18], we calculate the running time and accuracy under different number of features by using four machine learning algorithms such as DT [19], Extra Tree (ET) [20], XGBoost [21] and eXtreme Gradient Boosting with Random Forest (XGBRF) [22]. For mainly comparing the pros and cons of extracted features on the same algorithm, we do not adjust the parameters of the above four algorithms too much, and most of them are classified using default parameters.…”
Section: Local Controller Defence
confidence: 99%
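The comparison described in this excerpt, reducing features with PCA or SelectKBest and then timing four mostly-default classifiers, can be sketched roughly as follows; the synthetic dataset, the feature counts, and the choice of ExtraTreesClassifier to represent "Extra Tree (ET)" are assumptions, not the cited experiment's setup.

```python
# Hedged sketch: compare two feature-reduction methods across four
# classifiers, recording fit time and held-out accuracy. All data and
# parameter choices here are placeholders.
import time
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from xgboost import XGBClassifier, XGBRFClassifier

X, y = make_classification(n_samples=1000, n_features=40, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

reducers = {"PCA": PCA(n_components=10),
            "SKB": SelectKBest(f_classif, k=10)}
models = {"DT": DecisionTreeClassifier(),
          "ET": ExtraTreesClassifier(),       # assumed reading of "Extra Tree"
          "XGBoost": XGBClassifier(),
          "XGBRF": XGBRFClassifier()}

for r_name, reducer in reducers.items():
    Xr_tr = reducer.fit_transform(X_tr, y_tr)
    Xr_te = reducer.transform(X_te)
    for m_name, model in models.items():
        start = time.perf_counter()
        model.fit(Xr_tr, y_tr)                # mostly default parameters
        elapsed = time.perf_counter() - start
        acc = model.score(Xr_te, y_te)
        print(f"{r_name} + {m_name}: acc={acc:.3f}, fit time={elapsed:.3f}s")
```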