2008 IEEE International Conference on Semantic Computing
DOI: 10.1109/icsc.2008.70

Text Categorization Based on Boosting Association Rules

Abstract: Associative classification is a novel and powerful method originating from association rule mining. In previous studies, a relatively small number of high-quality association rules were used for prediction. We propose a new approach in which a large number of association rules are generated. The rules are then filtered using a new method that is equivalent to a deterministic Boosting algorithm. Through this equivalence, our approach effectively adapts to large-scale classification tasks such as text categorization…
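The abstract only outlines the idea, so the following is a minimal, hypothetical sketch of how a boosting-style filter over mined rules could look, assuming binary labels in {-1, +1}, candidate rules of the form word-set → class, and AdaBoost-style re-weighting; it is not the paper's exact deterministic procedure.

```python
import math

# Hypothetical sketch (not the paper's exact procedure): candidate association
# rules (word set -> class) are treated as weak classifiers and selected
# greedily with AdaBoost-style re-weighting of the training documents.
# `documents` is a list of (word_set, label) pairs with label in {-1, +1};
# `rules` is a list of (frozenset_of_words, label) candidates mined beforehand.

def rule_predict(rule, words):
    antecedent, label = rule
    # a rule votes for its class when its word set occurs in the document,
    # otherwise for the opposite class (a simple stump-style weak learner)
    return label if antecedent <= words else -label

def boost_select_rules(documents, rules, n_rounds=50):
    n = len(documents)
    weights = [1.0 / n] * n
    selected = []                      # list of ((antecedent, label), alpha)
    for _ in range(n_rounds):
        # pick the rule with the smallest weighted error under current weights
        best_rule, best_err = None, float("inf")
        for rule in rules:
            err = sum(w for w, (words, y) in zip(weights, documents)
                      if rule_predict(rule, words) != y)
            if err < best_err:
                best_rule, best_err = rule, err
        if best_rule is None or best_err >= 0.5:
            break                      # no rule beats random guessing: stop
        alpha = 0.5 * math.log((1 - best_err) / max(best_err, 1e-12))
        selected.append((best_rule, alpha))
        # up-weight misclassified documents, down-weight the rest, renormalize
        weights = [w * math.exp(alpha if rule_predict(best_rule, words) != y else -alpha)
                   for w, (words, y) in zip(weights, documents)]
        total = sum(weights)
        weights = [w / total for w in weights]
    return selected
```

The returned (rule, alpha) pairs form a weighted rule set that can be applied to new documents by weighted voting, as sketched at the end of this page.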

Cited by 22 publications (23 citation statements); References 22 publications.
“…Another improvement, Classification based on Predictive Association Rules (CPAR), assigns weights to the observations and upon correct classification, the weight of the observation is reduced [29]. CPAR can be viewed as a precursor to boosting PARs into a classifier model, which was proposed by Yoon et al [30]. Ozgur et al [22] proposed a procedure similar to growing decision trees on association rules for regression problems and Wang et al [27] grew decision trees on PARs with minimum support of 0.…”
Section: Background and Related Work
confidence: 98%
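The CPAR weighting idea quoted above (reduce, rather than remove, the weight of correctly covered observations) can be sketched roughly as follows; `grow_rule` and `covers` are hypothetical helpers standing in for FOIL-style rule growing, and the decay factor and stopping threshold are illustrative values rather than the ones used in CPAR.

```python
# Hedged sketch of CPAR-style weighting: instead of removing examples covered
# by a newly grown rule (as plain sequential covering would), only their
# weights are reduced, so later rules can still draw on them.

DECAY = 0.7                 # weight multiplier applied to covered examples
MIN_TOTAL_WEIGHT = 0.05     # stop once most of the weight has been covered

def cpar_style_rules(examples, weights, grow_rule, covers):
    initial_total = sum(weights)
    rules = []
    while sum(weights) > MIN_TOTAL_WEIGHT * initial_total:
        rule = grow_rule(examples, weights)
        if rule is None:
            break
        rules.append(rule)
        # reduce (rather than zero out) the weight of examples the rule covers
        for i, example in enumerate(examples):
            if covers(rule, example):
                weights[i] *= DECAY
    return rules
```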
“…[21,20,29,30]), which is partly due to the ease of interpretation of the individual rules and partly due to its good performance. Associative classification is less greedy than most other rule-based classifiers, because associative classification enumerates all PARs, while rule-based classifiers [7] or trees [4] consider only a subset of the PARs.…”
Section: Introduction
confidence: 97%
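To make the contrast in the citation above concrete, the sketch below enumerates class association rules (PARs) by brute force over small word sets; the thresholds, the size cap, and the brute-force search itself are assumptions chosen for readability, since practical miners use Apriori- or FP-growth-style algorithms.

```python
from itertools import combinations

# Illustrative sketch of "enumerating all PARs": every word set up to a small
# size whose support and confidence for some class clear the thresholds
# becomes a candidate rule, whereas a greedy tree or rule learner would
# explore only one branch of this space.

def enumerate_pars(transactions, min_support=0.01, min_confidence=0.6, max_len=2):
    n = len(transactions)                      # transactions: (word_set, label)
    vocabulary = sorted({w for words, _ in transactions for w in words})
    rules = []
    for size in range(1, max_len + 1):
        for itemset in combinations(vocabulary, size):
            itemset = frozenset(itemset)
            covered = [label for words, label in transactions if itemset <= words]
            if not covered or len(covered) / n < min_support:
                continue
            for label in set(covered):
                confidence = covered.count(label) / len(covered)
                if confidence >= min_confidence:
                    rules.append((itemset, label, len(covered) / n, confidence))
    return rules
```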
“…Table 4 depicts a comparison of the proposed algorithm against other well-known classifiers. It should be noted that the results of the BCAR algorithm are reported in (Yoon and Lee, 2008), while for MCAR the results were obtained via experiment. Table 4 reveals that the proposed method has the highest accuracy for three of the datasets: Acq, Grain and Money-FX.…”
Section: Experiments and Results
confidence: 99%
“…There are many AC algorithms disseminated in the literature in the last decade, such as CBA (Liu et al, 1999), CMAR (Li et al, 2001), CPAR (Han, 2003), MCAR (Thabtah et al, 2005), CACA (Tang and Liao, 2007), ACCF (Li et al, 2008), BCAR (Yoon and Lee, 2008), MAC (Abdelhamid et al, 2012), CBAR (Han, 2003) and PCBA (Chen et al, 2012). These techniques use different methods to discover the rules, sort the rules, store the rules, filter out redundant rules and assign the right class to test cases.…”
Section: Introduction
confidence: 99%
“…When applying association rules to document classification, the focus is on searching for associations of words that appear concurrently in a document. Yoon and Lee [26] used association rules to analyze the document sets of Reuters-21578 and 20 Newsgroups. They searched for rules in the training documents and then, with an appropriate screener, set up a rule database.…”
Section: Document Classification
confidence: 99%
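Following the pipeline described in the last citation (mine rules from the training documents, screen them into a rule database, then classify new documents), a minimal application step might look like the sketch below; the weighted-vote scoring and the tie-breaking default are assumptions layered on the earlier boosting sketch, not the scheme actually used by Yoon and Lee.

```python
# Sketch of using a screened rule database to label a new document: every
# rule whose word set occurs in the document casts a weighted vote for its
# class. `weighted_rules` is the ((antecedent, label), alpha) list returned
# by the boosting sketch above; the default label is an assumption.

def classify(words, weighted_rules, default_label=+1):
    score = 0.0
    for (antecedent, label), alpha in weighted_rules:
        if antecedent <= words:
            score += alpha * label
    if score == 0.0:
        return default_label
    return +1 if score > 0 else -1
```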