2021
DOI: 10.17010/ijcs/2021/v6/i6/167641
Analysis of Depth of Entropy and GINI Index Based Decision Trees for Predicting Diabetes

Cited by 7 publications (3 citation statements). References 0 publications.
“…The DT learning algorithm depends on selecting the best split point at each node. The splitting process uses the concepts of entropy and information gain [23], [24] to produce the best partition of the data. Entropy, inspired by information theory, measures the impurity of the sample values.…”
Section: B. Decision Tree
confidence: 99%
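To make the split criterion concrete, here is a minimal sketch of entropy and information gain for a candidate split. It is not code from the cited papers; the function names and the toy labels are illustrative.

import numpy as np

def entropy(labels):
    # Shannon entropy of a label array: H = -sum(p * log2(p))
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(parent, left, right):
    # Reduction in entropy achieved by splitting `parent` into `left` and `right`
    n = len(parent)
    weighted_child = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(parent) - weighted_child

# Hypothetical labels: 1 = diabetic, 0 = non-diabetic (illustrative only)
parent = np.array([1, 1, 1, 0, 0, 0, 0, 1])
left, right = parent[:4], parent[4:]   # one candidate split point
print(information_gain(parent, left, right))   # ~0.189 bits

A tree learner would evaluate this gain for every candidate split and keep the one with the largest value, which is the "best split point" the excerpt describes.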
“…They provide feature importance as part of their output [33]. The importance is calculated from how much each feature reduces the impurity in the tree [34]. The impurity measure is either the Gini impurity or information gain/entropy [34].…”
Section: Feature Selection
confidence: 99%
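As an illustration of this impurity-based importance, the sketch below fits a scikit-learn random forest on synthetic data and reads off feature_importances_. The dataset, parameters, and criterion choice are placeholder assumptions, not the citing paper's setup.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for a diabetes-style tabular dataset (illustrative only)
X, y = make_classification(n_samples=500, n_features=8, n_informative=3,
                           random_state=0)

# criterion="gini" uses Gini impurity; criterion="entropy" uses information gain
forest = RandomForestClassifier(n_estimators=100, criterion="gini",
                                random_state=0).fit(X, y)

# Mean decrease in impurity, normalized to sum to 1 across features
for i, imp in enumerate(forest.feature_importances_):
    print(f"feature {i}: {imp:.3f}")

Switching criterion to "entropy" swaps Gini impurity for the information-gain criterion described in the earlier excerpt; both rank features by how much they reduce node impurity.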
“…This is calculated by comparing the entropy of the dataset before and after splitting it on the feature of interest. Entropy is a measure of the degree of impurity or uncertainty in a dataset [31]; it reflects how homogeneous the samples within a class are.…”
Section: Random Forest - A Supervised Machine Learning
confidence: 99%
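The before/after comparison can be shown with a small worked example; the node sizes and class counts below are hypothetical, not taken from the cited work.

import math

def H(p):
    # Binary entropy for a node whose positive-class proportion is p
    if p in (0.0, 1.0):
        return 0.0   # a pure (fully homogeneous) node has zero entropy
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# Hypothetical parent node: 10 samples, 5 positive -> maximally impure
before = H(5 / 10)                                 # 1.000 bit

# After splitting on the feature of interest:
# left branch 4 samples (4 positive), right branch 6 samples (1 positive)
after = (4 / 10) * H(4 / 4) + (6 / 10) * H(1 / 6)  # ~0.390 bits

print(f"entropy before:   {before:.3f}")
print(f"entropy after:    {after:.3f}")
print(f"information gain: {before - after:.3f}")   # ~0.610 bits

The drop from 1.000 to about 0.390 bits is the information gain: the children are more homogeneous than the parent, which is exactly what the excerpt means by comparing entropy before and after the split.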