2003
DOI: 10.1002/cem.816
Induction of decision trees using fuzzy partitions

Abstract: A new method for the induction of fuzzy decision trees is introduced. The fuzzy decision tree classifier improves prediction accuracy using smaller models by locating more robust splitting regions. The proposed method also provides a measure of confidence for sample classification by propagating partition memberships into all leaf nodes, thereby relaxing local subspace restrictions. The fuzzy decision tree algorithm is presented and compared against standard and bagged decision tree classifiers.
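
To illustrate the membership-propagation idea summarized in the abstract, the Python sketch below pushes a sample through a small hand-built fuzzy tree: each crisp split is softened into a sigmoid membership, the memberships reach every leaf, and the per-class totals serve as classification confidences. The tree structure, the sigmoidal split function and all thresholds and widths are illustrative assumptions, not the authors' implementation.

import math

def soft_split(x, threshold, width):
    """Membership of x in the 'left' branch of a fuzzified split.

    A crisp test (x <= threshold) is softened into a sigmoid whose slope is
    set by `width`; membership in the 'right' branch is the complement.
    """
    return 1.0 / (1.0 + math.exp((x - threshold) / width))

def classify(sample):
    """Propagate the sample's membership into every leaf and sum per class."""
    # Hypothetical two-level tree: the root splits on feature 0 and its left
    # child splits on feature 1; each leaf carries a class label.
    m_left = soft_split(sample[0], threshold=2.5, width=0.4)
    m_right = 1.0 - m_left
    m_left_left = m_left * soft_split(sample[1], threshold=1.0, width=0.3)
    m_left_right = m_left - m_left_left
    # Leaf memberships sum to 1 by construction, so the per-class totals can
    # be read directly as classification confidences.
    return {"A": m_left_left + m_right, "B": m_left_right}

print(classify([2.4, 1.1]))  # the two confidences sum to 1.0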

Cited by 11 publications (6 citation statements)
References 6 publications
“…These datasets are those used in the studies (Choi and Moon 2007; Li 2009; Myles and Brown 2003). We compare the OFP_CLASS results with those obtained by the methods proposed in (Choi and Moon 2007; Li 2009; Myles and Brown 2003). All numeric attributes of these datasets are initially left undiscretized.…”
Section: Methods
confidence: 98%
“…• NFDT: In (Myles and Brown 2003), Myles and Brown introduce a method for the induction of fuzzy decision trees that determines the location and the associated uncertainty of each decision boundary during the construction process. In this process, a fuzzy partition of the attributes is generated that provides a confidence estimate of classification through membership propagation.…”
Section: Methods
confidence: 99%
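
The boundary-with-uncertainty idea described in the NFDT statement above can be sketched as follows, assuming a simple linear-ramp (trapezoidal) transition: each split stores a boundary location plus an uncertainty half-width, and a value's membership in the two branches is read off the ramp instead of a hard threshold. The function name and all numbers are hypothetical, not taken from the paper.

def branch_memberships(x, boundary, uncertainty):
    """Return (left, right) memberships for a split with a fuzzy boundary.

    Outside the uncertainty region the split behaves like a crisp test;
    inside it, membership ramps linearly from one branch to the other.
    """
    lo, hi = boundary - uncertainty, boundary + uncertainty
    if x <= lo:
        left = 1.0
    elif x >= hi:
        left = 0.0
    else:
        left = (hi - x) / (hi - lo)  # linear ramp across the fuzzy region
    return left, 1.0 - left

# Example: a boundary located at 5.0 with a +/- 0.5 uncertainty region.
for value in (4.2, 4.9, 5.0, 5.3, 6.0):
    print(value, branch_memberships(value, boundary=5.0, uncertainty=0.5))
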
“…They are solved using a training set of samples, which are known a priori to belong to particular classes. The methods for unsupervised classification are mainly based on PCA decomposition followed by analysis of distances between classes,136 construction of dendrograms, the use of fuzzy sets,137 etc. Procrustean rotation138 and the Mahalanobis distance139–141 have been used for these purposes.…”
Section: Classification and Discrimination
confidence: 99%
“…In order to address this specific goal, different learning techniques have been reported in the literature for generating fuzzy sets automatically. These techniques include decision trees [2,3,4], clustering [5,6], hybrid models [7,8,9] and evolutionary algorithms [10,11,12]. The presence of 3D-MF in T2FLS necessitates the adjustment of more parameters than in T1FLS, which makes the learning process more complicated [13].…”
Section: Introduction
confidence: 99%