2020
DOI: 10.1142/s0219622020500236
A New Adaptive Weighted Deep Forest and Its Modifications

Abstract: A new adaptive weighted deep forest algorithm, which can be viewed as a modification of the confidence screening mechanism, is proposed. The main idea underlying the algorithm is adaptive weighting of every training instance at each cascade level of the deep forest. The confidence screening mechanism for the deep forest proposed by Pang et al. strictly removes instances from the training and testing processes to simplify the whole algorithm in accordance with the obtained random forest class probability dis…
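The cascade-with-instance-weights idea from the abstract can be illustrated with a minimal sketch. This is not the authors' exact scheme (their weighting rule and screening details are not given in the truncated abstract); it only shows the general pattern of a forest cascade where instances the current level classifies with low confidence are upweighted at the next level. All names and the reweighting formula are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Toy data; in the paper each cascade level would also augment features
# with the previous level's class-probability vectors.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

weights = np.ones(len(y))  # start from uniform instance weights

for level in range(3):  # a small cascade of forest levels
    forest = RandomForestClassifier(n_estimators=50, random_state=level)
    forest.fit(X, y, sample_weight=weights)
    proba = forest.predict_proba(X)
    # Confidence = predicted probability of the true class (classes are 0/1,
    # so indexing proba columns by y is valid here).
    confidence = proba[np.arange(len(y)), y]
    # Illustrative adaptive rule: low-confidence instances get larger
    # weight at the next level (the paper's actual rule may differ).
    weights = 1.0 - confidence + 1e-3
    weights *= len(y) / weights.sum()  # keep total weight mass constant
```

The normalization step keeps the effective sample size stable across levels, so later forests are not trained on vanishingly small total weight.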

Cited by 12 publications (9 citation statements) · References 40 publications
“…The case opt = 1 means that weights of trees are totally determined by the tree results and do not depend on each instance. This case coincides with the weighted RF proposed in [30]. The case opt = 0 means that weights of trees are determined only by the softmax function (with or without trainable parameters).…”
Section: Regression (supporting)
confidence: 73%
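The opt convention in the statement above, where opt = 1 yields tree weights fully determined by the tree results (as in the weighted RF of [30]) and opt = 0 yields weights determined only by a softmax, can be sketched as a simple interpolation. The function name, the use of per-tree quality scores, and the temperature parameter are illustrative assumptions, not the cited paper's exact formulation.

```python
import numpy as np

def combine_tree_predictions(tree_preds, tree_scores, opt, temperature=1.0):
    """Blend instance-independent result-based tree weights with
    softmax-based weights, following the opt convention (illustrative).

    tree_preds:  (n_trees, n_instances) per-tree predictions
    tree_scores: (n_trees,) positive per-tree quality scores
    opt:         1 -> weights fully from tree results; 0 -> softmax only
    """
    softmax_w = np.exp(tree_scores / temperature)
    softmax_w /= softmax_w.sum()             # softmax over tree scores
    result_w = tree_scores / tree_scores.sum()  # result-determined weights
    w = opt * result_w + (1 - opt) * softmax_w
    return w @ tree_preds                    # weighted ensemble prediction
```

For example, with scores (1, 3) and opt = 1 the second tree receives weight 0.75, independent of the instance, matching the instance-independent case described in the citation.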
“…However, the assigned weights in the aforementioned works are not trainable parameters. Attempts to train the weights of trees were carried out in [30,31,7,8], where weights are assigned by solving optimization problems, i.e., they are incorporated into a certain loss function of the whole RF such that the loss function is minimized over the weight values.…”
Section: Related Work (mentioning)
confidence: 99%
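Fitting tree weights by minimizing a loss over the whole forest, as the statement above describes, can be sketched with plain gradient descent. This is a generic illustration, not the optimization used in [30,31,7,8]: it parameterizes the weights through a softmax (so they stay positive and sum to one) and minimizes a mean-squared error between the weighted ensemble prediction and a target.

```python
import numpy as np

rng = np.random.default_rng(0)
tree_preds = rng.normal(size=(5, 100))               # 5 trees, 100 instances
target = tree_preds[1] + 0.1 * rng.normal(size=100)  # tree 1 is best by design

theta = np.zeros(5)  # unconstrained parameters; softmax maps them to weights
for _ in range(500):
    w = np.exp(theta) / np.exp(theta).sum()
    pred = w @ tree_preds                            # weighted forest output
    grad_pred = 2 * (pred - target) / len(target)    # dMSE/dpred
    grad_w = tree_preds @ grad_pred                  # chain rule: dMSE/dw
    # Softmax Jacobian: dw_i/dtheta_j = w_i (delta_ij - w_j)
    grad_theta = w * (grad_w - w @ grad_w)
    theta -= 0.5 * grad_theta
w = np.exp(theta) / np.exp(theta).sum()              # final learned weights
```

After training, the weight of the tree whose predictions best match the target dominates, which is exactly the behavior a loss-minimizing weight assignment is meant to produce.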
“…AWDF: AWDF [20] is an improved deep forest method, which adopts adaptive weighting of every training instance at each cascade level.…”
Section: Baseline Methods and Research Questions (mentioning)
confidence: 99%
“…Yang et al. [19] proposed a multi-label learning deep forest technique, which employed measure-aware feature reuse and layer growth to solve a multi-label learning problem. Utkin et al. [20] proposed a novel model called the adaptive weighted deep forest, in which each training instance is given a weight at each cascade level of the model. Recently, gcForest has been applied to the classification of schizophrenia data [21], disease classification [22], and cancer detection [23,24,25].…”
Section: Introduction (mentioning)
confidence: 99%