2014
DOI: 10.2481/dsj.14-035
CHISC-AC: Compact Highest Subset Confidence-Based Associative Classification

Abstract: The associative classification method integrates association rule mining and classification. Constructing an efficient classifier from a small set of high-quality rules is an important yet challenging task. The lazy learning associative classification method removes the need to build a classifier in advance but suffers from high computation costs. This paper proposes a Compact Highest Subset Confidence-Based Associative Classification scheme that generates compact subsets based on information gain …
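The abstract mentions ranking attributes by information gain to form compact subsets. As a rough illustration of that building block (not the paper's actual algorithm — the dataset and attribute values below are invented for the example), information gain measures how much splitting on one attribute reduces class entropy:

```python
# Minimal sketch of information-gain-based attribute ranking, the kind of
# measure lazy associative classifiers use to pick compact attribute subsets.
# The toy data below is an illustrative assumption, not from the paper.
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Reduction in class entropy after partitioning on one attribute."""
    base = entropy(labels)
    groups = {}
    for row, label in zip(rows, labels):
        groups.setdefault(row[attr_index], []).append(label)
    weighted = sum(len(g) / len(labels) * entropy(g) for g in groups.values())
    return base - weighted

# Toy data: two attributes, binary class.
rows = [("sunny", "high"), ("sunny", "low"), ("rain", "high"), ("rain", "low")]
labels = ["no", "no", "yes", "yes"]
gains = [information_gain(rows, labels, i) for i in range(2)]
# Attribute 0 perfectly separates the classes (gain 1.0);
# attribute 1 carries no class information (gain 0.0).
```

Ranking attributes by this score and keeping only the top-scoring ones is one common way to shrink the rule-generation space.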


Cited by 5 publications (3 citation statements)
References 17 publications
“…The accuracy comparison is shown in Table 2, where the dataset name is tabulated in column 1; the 2nd column is the traditional Lazy Associative Classification method, LAC [2]; and the 3rd, 4th, and 5th columns are existing lazy learning methods, namely Lazy Associative Classification using Information gain (LACI) [9], Compact Highest Subset Confidence-based Associative Classification (CHiSC-AC) [15], and Attribute-ranking-based lazy learning AC [12]. The comparison shows that the proposed LLAC-WkNN system is 10.17% better than LAC, 8.23% better than LACI, 3.43% better than CHiSC, and 0.40% better than ARBLazyAC. The proposed LLAC-DWkNN is 13.97% better than LAC, 11.97% better than LACI, 7.00% better than CHiSC, and 3.91% better than ARBLazyAC.…”
Section: Results
confidence: 99%
“…A Decision Tree is used in [14] for early prediction of diabetes. The compact subset generation method is used in the ChiSC-AC method [15]. Multiple algorithms and methods have been presented by various authors, but there is still room to improve classifier accuracy.…”
Section: Introduction
confidence: 99%
“…Other techniques for mining CARs have been suggested in recent years. They include GARC [5], ECR-CARM [6], CBC [7], CAR-Miner [8], CHISC-AC [9], and d2O [10]. Classification methods based on CARs have been demonstrated to be more accurate than classic methods, e.g.…”
Section: Introduction
confidence: 99%