Time-constrained cost-sensitive decision tree induction (2016)
DOI: 10.1016/j.ins.2016.03.022

Cited by 22 publications (8 citation statements)
References 23 publications
“…The test cost and test time for the variables in both the Pima Indians Diabetes and the Car datasets are set randomly between 1 and 100, as shown in Tables 4 and 5, respectively. The time constraint maxtime is estimated by the method described in Chen et al. [5] and set as 300, 349, and 150 for the Heart Disease, Diabetes, and Car data, respectively.…”
Section: Datasets (mentioning, confidence: 99%)
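The setup quoted above is straightforward to reproduce in outline. The following is a minimal sketch, assuming hypothetical feature names and using Python's standard random module: each variable receives a test cost and a test time drawn uniformly from 1 to 100, and a dataset-specific budget maxtime caps the total test time a decision path may consume.

import random

random.seed(0)  # fixed seed only so the illustration is repeatable

# Hypothetical feature names; one cost and one time are drawn per variable.
features = ["glucose", "blood_pressure", "insulin", "bmi", "age"]
test_cost = {f: random.randint(1, 100) for f in features}
test_time = {f: random.randint(1, 100) for f in features}

maxtime = 349  # budget reported above for the Diabetes data

def within_budget(path_features, budget):
    # A candidate decision path is feasible only if the accumulated test time
    # of the features it uses stays within the time constraint.
    return sum(test_time[f] for f in path_features) <= budget

print(within_budget(["glucose", "bmi"], maxtime))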
“…Wan [48] proposes a multi-batch strategy. Kao and Tang [17] introduce the cost-sensitive-with-late-constraint (CSLC), and Chen, Wu, and Tang [5] present a time-constrained minimal-cost decision tree (henceforth, CWT algorithm), showing satisfactory performance. Wu, Chen, and Tang [50] extend CWT with the Cost-Sensitive Associative Tree (CAT) for multiple resource constraints.…”
Section: Introduction (mentioning, confidence: 99%)
“…Because the coefficients in the high-frequency wavelet subbands are sparse, we can divide the subbands into several blocks and then check whether they contain significant coefficients. The tree structure is widely used in many fields of information sciences [30,31,32], and is also a good model for compression [33]. In [7], each block is split into four sub-blocks once it tests as significant with respect to the current threshold.…”
Section: The BTCA Algorithm (mentioning, confidence: 99%)
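For readers unfamiliar with significance-test block splitting, the sketch below illustrates the idea in the quotation: a generic quadtree partition under an assumed magnitude threshold. It is not the exact procedure of [7] or the BTCA algorithm, and the subband values are invented for illustration.

import numpy as np

def split_significant_blocks(block, threshold, min_size=2):
    # A block is "significant" if any coefficient magnitude reaches the threshold.
    # Significant blocks are split into four sub-blocks and examined recursively;
    # insignificant (or minimum-size) blocks are returned whole.
    h, w = block.shape
    significant = np.max(np.abs(block)) >= threshold
    if not significant or h <= min_size or w <= min_size:
        return [(block.shape, significant)]
    hh, hw = h // 2, w // 2
    quads = (block[:hh, :hw], block[:hh, hw:], block[hh:, :hw], block[hh:, hw:])
    leaves = []
    for q in quads:
        leaves.extend(split_significant_blocks(q, threshold, min_size))
    return leaves

# A sparse 8x8 high-frequency subband with two large coefficients.
subband = np.zeros((8, 8))
subband[1, 6] = 35.0
subband[5, 2] = -18.0
print(split_significant_blocks(subband, threshold=16.0))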
“…In other words, when there is enough time to build a tree, the algorithm should choose the splitting feature with the greatest benefit, whereas it should choose the feature that is most effective in terms of time when time is limited. However, the method has not been applied to big data, given the complexity of the resulting tree [3].…”
Section: Introduction (mentioning, confidence: 99%)
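To make the trade-off described in that statement concrete, here is an illustrative selection rule, assuming a precomputed benefit score (e.g., information gain) and a per-feature test time. This is an assumed heuristic sketched for illustration, not the published criterion of the time-constrained algorithm: with ample remaining time it maximizes raw benefit, and when the budget is tight it maximizes benefit per unit of test time.

def choose_split_feature(benefit, test_time, remaining_time, slack=2.0):
    # benefit[f]: assumed precomputed splitting benefit of feature f (e.g., information gain)
    # test_time[f]: time needed to acquire feature f
    # remaining_time: time left in the budget along the current path
    affordable = [f for f in benefit if test_time[f] <= remaining_time]
    if not affordable:
        return None  # no test fits the remaining budget; stop splitting here
    if remaining_time >= slack * max(test_time[f] for f in affordable):
        return max(affordable, key=lambda f: benefit[f])             # time is ample: most beneficial test
    return max(affordable, key=lambda f: benefit[f] / test_time[f])  # time is tight: best benefit per time unit

# Hypothetical numbers: feature A is valuable but slow, C is cheap but weak.
benefit = {"A": 0.40, "B": 0.35, "C": 0.10}
test_time = {"A": 90, "B": 20, "C": 5}
print(choose_split_feature(benefit, test_time, remaining_time=300))  # ample time -> "A"
print(choose_split_feature(benefit, test_time, remaining_time=25))   # tight time -> "C"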