2023
DOI: 10.1016/j.eswa.2022.118883
Cautious weighted random forests

Cited by 14 publications (10 citation statements). References 39 publications.
“…SVM is a robust algorithm with mature theory and high efficiency, due to which it has a good application effect in multiple fields [34,35]. On the other hand, RF is known for reducing variance error and can be trained in parallel to improve computational efficiency [36]. These robust and multi-dimensional aspects of CART, RF and SVM make them diverse base learners.…”
Section: Proposed Methodology
confidence: 99%
“…The procedure described in Alg. 3, hereafter referred to as CDM Vote (standing for "cautious decision-making via voting"), extends voting to the case where votes are expressed as subsets of classes and returns the subset A⋆ = arg max EU(A) among all subsets A ⊆ Ω such that |A| ≤ M ≤ K. It generalizes the method proposed in [24,25] for binary cautious classification. It is computationally less efficient than CDM Ave, although its time complexity can be controlled, as will be shown in the experimental part.…”
Section: Generalization Of Voting
confidence: 99%
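The selection rule quoted above (return the subset A maximising an expected utility EU(A), over all subsets with |A| ≤ M) can be sketched by brute-force enumeration. The utility function below is a hypothetical discounted choice and the names (`cdm_vote`, `utility`) are illustrative, not the cited paper's exact definitions:

```python
from itertools import combinations

def cdm_vote(votes, classes, M):
    """Sketch of set-valued voting: pick the subset A (|A| <= M)
    maximising an average utility over the trees' subset votes."""
    def utility(A, B):
        # assumed discounted utility: credit 1/|A| when the tree's
        # vote B intersects candidate subset A (placeholder choice)
        return 1.0 / len(A) if A & B else 0.0

    best, best_eu = None, float("-inf")
    for size in range(1, M + 1):
        for A in combinations(classes, size):
            A = frozenset(A)
            # expected utility estimated as the mean over tree votes
            eu = sum(utility(A, B) for B in votes) / len(votes)
            if eu > best_eu:
                best, best_eu = A, eu
    return best, best_eu
```

Enumerating all subsets up to size M is exponential in M, which matches the quoted remark that CDM Vote is less efficient than CDM Ave unless its complexity is controlled (e.g. by keeping M small).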
“…In [24,25], we proposed a generalized voting aggregation strategy for binary cautious classification within the belief function framework. In the present paper, we extend these previous works to the multi-class case.…”
Section: Introduction
confidence: 99%
“…These bounds are obtained using the Imprecise Dirichlet Model, and reflect the estimation uncertainty due to the lack of training data. These intervals can be pooled using the theory of belief functions, by computing the belief and plausibility bel(Y = 1|x) and pl(Y = 1|x), which can then be used in a cautious decision-making process such as interval dominance, possibly resulting in indeterminate decisions (Zhang et al, 2021).…”
Section: Cautious Random Forests
confidence: 99%
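For the binary case described above, interval dominance has a simple closed form: class 1 is selected when its lower probability bel(Y=1|x) exceeds class 0's upper probability 1 − bel(Y=1|x), and symmetrically for class 0, with the indeterminate set {0, 1} returned otherwise. A minimal sketch (the function name is illustrative):

```python
def interval_dominance_binary(bel1, pl1):
    """Set-valued decision from the pooled interval [bel1, pl1]
    on P(Y = 1 | x).  For binary Y, P(Y = 0 | x) lies in
    [1 - pl1, 1 - bel1]."""
    # class 1 dominates class 0: bel1 > 1 - bel1, i.e. bel1 > 0.5
    if bel1 > 0.5:
        return {1}
    # class 0 dominates class 1: 1 - pl1 > pl1, i.e. pl1 < 0.5
    if pl1 < 0.5:
        return {0}
    # intervals overlap 0.5: indeterminate decision
    return {0, 1}
```

The wider the interval (i.e. the scarcer the training data reaching the leaves), the more often the indeterminate set is returned, which is exactly the cautious behaviour the quoted passage describes.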
“…However, when training data are scarce, or when mistakes have a very high cost, cautious classifiers can alternatively be used to provide set-valued decisions rather than single classes and thus control the risk. Cautious random forests (CRF) (Zhang et al, 2021) are one such classifier. A CRF combines the classical random forest (RF) strategy (Breiman, 2001), the Imprecise Dirichlet Model (IDM) (Walley, 1996) and the theory of belief functions (Shafer, 1976).…”
Section: Introduction
confidence: 99%
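The IDM component mentioned in the quote replaces each tree's leaf frequency with a probability interval: with n1 positive examples out of n in a leaf and hyperparameter s, Walley's IDM gives [n1/(n+s), (n1+s)/(n+s)]. The sketch below computes these per-tree bounds and pools them by simple averaging; averaging is a simplifying assumption here, not necessarily the belief-function combination rule used by the CRF authors:

```python
def idm_interval(n1, n, s=2.0):
    """Imprecise Dirichlet Model bounds on P(Y = 1) from leaf
    counts: n1 positives out of n (Walley, 1996)."""
    return n1 / (n + s), (n1 + s) / (n + s)

def pooled_bel_pl(leaf_counts, s=2.0):
    """Pool per-tree IDM intervals into a single [bel, pl] interval.
    Averaging the interval endpoints is a crude stand-in for the
    belief-function pooling described in the quoted passage."""
    intervals = [idm_interval(n1, n, s) for n1, n in leaf_counts]
    bel = sum(lo for lo, _ in intervals) / len(intervals)
    pl = sum(hi for _, hi in intervals) / len(intervals)
    return bel, pl
```

Note how s widens the interval most when n is small: a leaf with few training examples contributes a wide, cautious interval, which is the estimation uncertainty the quoted statements refer to.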