2022
DOI: 10.1007/s10489-021-03033-7

RETRACTED ARTICLE: Three-way selection random forest algorithm based on decision boundary entropy

Cited by 8 publications (5 citation statements)
References 40 publications
“…Data carry uncertain information, and both information entropy and approximate classification quality are measures of such uncertainty. Accordingly, an attribute importance defined by combining information entropy, attribute dependence, and approximate classification quality is used to evaluate each attribute [13]. We improved the three-way attribute random selection algorithm of [13] to obtain a new reduction method, thereby achieving dimension reduction.…”
Section: Attribute Reduction
confidence: 99%
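As an illustration of how such a combined importance measure might be computed, here is a minimal Python sketch. The equal weighting, the rough-set-style dependence estimate, and all function names are assumptions for illustration, not the formulation used in [13]:

import numpy as np
from collections import defaultdict

def entropy(labels):
    # Shannon entropy of a label vector
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def dependence(values, labels):
    # Rough-set-style dependence: fraction of samples whose equivalence
    # class (same attribute value) is consistent in its labels
    groups = defaultdict(list)
    for v, y in zip(values, labels):
        groups[v].append(y)
    consistent = sum(len(ys) for ys in groups.values() if len(set(ys)) == 1)
    return consistent / len(labels)

def importance(values, labels, w=(1/3, 1/3, 1/3)):
    # Hypothetical combined score: weighted mix of information gain,
    # dependence, and approximation quality (which, for a single
    # attribute in this simple setting, coincides with dependence)
    groups = defaultdict(list)
    for v, y in zip(values, labels):
        groups[v].append(y)
    h_cond = sum(len(ys) / len(labels) * entropy(ys) for ys in groups.values())
    gain = entropy(labels) - h_cond
    dep = dependence(values, labels)
    return w[0] * gain + w[1] * dep + w[2] * dep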
“…Calculate g(c). When determining the number of attributes, if too many attributes are kept, the effect of dimension reduction is not achieved; if too few are kept, model accuracy may suffer. In reference [13], the authors take √p attributes to train each subtree and thereby obtain a random forest. Since continuing to select √p attributes is clearly not appropriate here, half of the original attributes are taken as the number of attributes after attribute selection.…”
Section: Three-way Attribute Reduction
confidence: 99%
“…Zhang et al. proved the most direct simplified version of the original RF, in which both the split feature and the split value are chosen at random: one feature is selected uniformly from all features, and one observed value of that feature is selected as the split value [16]. Bao et al. connected RF to the representation learning process of a deep neural network by turning the split function into a random decision function, so that the parameters of the whole network can be updated through backpropagation [18]. He et al. also proposed a deep structure based entirely on RF [19].…”
Section: Research Status Of RF Algorithm
confidence: 99%
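A minimal sketch of that fully randomized split rule, in which both the feature and the threshold are drawn at random; the function name and the NumPy framing are assumptions, not the cited papers' code:

import numpy as np

def random_split(X, rng=None):
    # Fully randomized split: pick one feature uniformly from all
    # features, then one observed value of that feature as the threshold
    rng = rng or np.random.default_rng()
    feature = int(rng.integers(X.shape[1]))
    value = rng.choice(X[:, feature])
    left = X[:, feature] <= value
    return feature, value, left, ~left

# Usage: j, v, left_mask, right_mask = random_split(X)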
“…However, the inherent randomness in the attribute selection process during the construction of a random forest can result in suboptimal decision trees. To address this issue, reference [11] proposes a novel approach based on decision boundary entropy, which makes a three-way selection for each attribute during decision tree construction. This method improves the accuracy of the individual decision trees and, in turn, the overall classification performance of the forest.…”
Section: Introduction
confidence: 99%
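As a rough illustration of how a three-way selection over attribute scores could operate, the sketch below partitions attributes into accept, boundary, and reject regions using two thresholds. The thresholds, the entropy-style score input, and the random handling of the boundary region are all assumptions for illustration, not the procedure of the retracted article:

import numpy as np

def three_way_select(scores, alpha=0.6, beta=0.3, rng=None):
    # Hypothetical three-way selection: attributes scoring >= alpha are
    # accepted, those scoring < beta are rejected, and each attribute in
    # the boundary region [beta, alpha) is kept with probability 0.5
    rng = rng or np.random.default_rng()
    scores = np.asarray(scores, dtype=float)
    accept = scores >= alpha
    boundary = (scores >= beta) & ~accept
    accept |= boundary & (rng.random(scores.shape) < 0.5)
    return np.flatnonzero(accept)

# three_way_select([0.9, 0.5, 0.1]) always keeps attribute 0, never
# keeps attribute 2, and keeps attribute 1 about half the time.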